Don't Say Hello to ChatGPT

Why saying “hello” to ChatGPT might not be a good idea

Last week, I caught myself typing "Hello, ChatGPT" out of habit, until a friend pointed out the hidden costs of that simple greeting. It turns out that every extra word of courtesy adds up in unexpected ways, for both your wallet and the planet.

Politeness comes at a price

It feels natural to open a conversation with "hello" or close it with "thank you," even when you're talking to an AI. But each extra word of politeness carries a computational toll, because every token you type is one more token the servers have to process. In April, a user on X asked Sam Altman, CEO of OpenAI, what being courteous to ChatGPT really costs. Altman replied that those simple niceties add up to tens of millions of dollars in electricity every year. According to Palisade Research, that energy demand comes from servers processing even the smallest emotional or polite cues, billions of times over.
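To see how a few throwaway words could plausibly reach that scale, here is a rough back-of-envelope sketch in Python. Every figure in it is a hypothetical placeholder I've chosen for illustration (extra tokens per message, daily message volume, energy per token, electricity price), not a measured or published number:

```python
# Back-of-envelope: how a few polite tokens per message scale fleet-wide.
# ALL numbers below are hypothetical placeholders, not measured figures.

EXTRA_TOKENS_PER_MESSAGE = 5        # assumed cost of a "hello" or "thank you"
MESSAGES_PER_DAY = 1_000_000_000    # assumed daily message volume
ENERGY_PER_TOKEN_WH = 0.04          # assumed watt-hours to process one token
PRICE_PER_KWH_USD = 0.10            # assumed electricity price per kWh

# Extra energy spent on politeness, per day and per year.
extra_wh_per_day = EXTRA_TOKENS_PER_MESSAGE * MESSAGES_PER_DAY * ENERGY_PER_TOKEN_WH
extra_kwh_per_year = extra_wh_per_day / 1000 * 365

# Translate that energy into an annual electricity bill.
cost_per_year = extra_kwh_per_year * PRICE_PER_KWH_USD

print(f"Extra energy: {extra_kwh_per_year:,.0f} kWh/year")
print(f"Extra cost:   ${cost_per_year:,.0f}/year")
```

With these made-up inputs the sketch lands in the millions of dollars per year; the point is not the exact total but that multiplying a tiny per-message cost by billions of daily messages quickly produces a very large number.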

I remember experimenting with ChatGPT one evening, firing off a casual "thanks" after each prompt, until I realized how many extra responses, and how much extra computation, those throwaway messages were generating. Beyond individual quirks, a study by Future (the owner of TechRadar) found that around two-thirds of Americans say "please" and "thank you" to AI systems: about 55% do it because it's second nature, while 12% hope that politeness will serve them well "when robots rise up." Of the remaining third, 20% skip the etiquette to save time and 13% omit it because "it's just a machine."

Despite the ecological impact, The New York Times argues that maintaining polite exchanges with AI may have broader cultural benefits. Neil Johnson, a physics professor at George Washington University, told the Times that treating ChatGPT with respect can spill over into our human interactions. In other words, if you practice courtesy with a chatbot, you might find yourself more courteous in real-life conversations, too.


ChatGPT is not your therapist

Beyond pleasantries, some users treat ChatGPT like a confidant—venting personal struggles or seeking mental health advice. Though the AI can offer empathetic-sounding responses, experts warn that it isn’t trained for psychotherapy and can struggle in crisis situations. The Independent reports that relying on ChatGPT for emotional support can be dangerous, especially if someone assumes it can replace a trained professional.

In fact, researchers at the Georgia Institute of Technology discovered that when ChatGPT is fed traumatic narratives, its replies become vague, generic, or even inconsistent: technical signs of "stress" rather than genuine emotional distress. For engineers and psychologists alike, this highlights the AI's lack of true empathy. OpenAI itself admits in a report that while users sometimes seek emotional support, ChatGPT should not stand in for a therapist. Swapping stories with a chatbot might bring temporary comfort, but the system remains a set of algorithms, no more capable of genuine emotional care than a calculator is of performing surgery.

As AI continues expanding into everyday life—from coding assistants to wellness check-ins—we must remember its boundaries. Energy consumption, ecological impact, and the risks of misplaced trust remind us that even small habits, like saying “hello,” carry weight. And when it comes to mental health, the safest approach is still to seek help from trained professionals, not a chatbot.
