Don’t Say Please And Thank You To ChatGPT – It Can Cost Millions

Posted by Alvin Palmejar


If you’re in the habit of being polite to AI — tossing in a friendly “please” or “thank you” when chatting with ChatGPT — you might be surprised to learn it’s not just good manners; it’s expensive. According to OpenAI CEO Sam Altman, those small courtesies are adding up to a surprisingly high cost for the company.

Altman recently responded to a user on X (formerly Twitter) who asked whether politeness toward ChatGPT affected operating costs. His answer? Yes — and in a big way. He said that users’ tendency to pad their prompts with pleasantries is part of what’s costing OpenAI “tens of millions of dollars.” He even described the expense as “well spent,” though he left the comment open-ended with a cryptic “you never know.”

It may sound like a joke, but there’s a serious explanation behind it.

ChatGPT, like other AI chatbots, runs on massive neural networks called large language models (LLMs). These models are powered by an army of GPUs (graphics processing units), which are housed in energy-hungry data centers. Every interaction with ChatGPT — even short ones — requires significant computational power. Adding extra words, like “please” and “thank you,” means slightly more data to process, slightly more tokens to parse, and ultimately, slightly more power consumed per interaction.
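To make that token overhead concrete, here is a minimal sketch using OpenAI's tiktoken library; the choice of the cl100k_base encoding is an assumption for illustration, not a statement about what ChatGPT uses internally. It simply counts how many extra tokens a polite framing adds to an otherwise identical request.

```python
# Rough illustration of the token overhead from polite phrasing.
# Assumes the tiktoken package is installed (pip install tiktoken);
# cl100k_base is one encoding choice, used here only for illustration.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")

terse = "Summarize this article in three bullet points."
polite = "Could you please summarize this article in three bullet points? Thank you!"

terse_tokens = enc.encode(terse)
polite_tokens = enc.encode(polite)

print(f"Terse prompt:  {len(terse_tokens)} tokens")
print(f"Polite prompt: {len(polite_tokens)} tokens")
print(f"Extra tokens per request: {len(polite_tokens) - len(terse_tokens)}")
```

A handful of extra tokens per request is trivial on its own; the point is that, multiplied across billions of requests, even single-digit overheads translate into real compute and electricity.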

Now multiply that by billions.

It’s estimated that a single prompt-response exchange can use up to 0.14 kilowatt-hours of energy — the same amount of electricity needed to keep 14 LED lightbulbs on for an hour. When scaled across millions (if not billions) of daily interactions, those little extras compound into a major energy burden.
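The bulb comparison checks out if you assume roughly 10-watt LED bulbs, a figure the article implies rather than states. A quick back-of-the-envelope sketch:

```python
# Back-of-the-envelope check of the cited figure: 0.14 kWh per exchange
# versus 14 LED bulbs running for one hour. The 10 W per bulb is an
# assumption the article implies but does not state.
energy_per_exchange_kwh = 0.14   # upper-bound estimate cited above
led_bulb_watts = 10              # assumed wattage of a typical LED bulb
hours = 1

bulbs_equivalent = (energy_per_exchange_kwh * 1000) / (led_bulb_watts * hours)
print(f"Equivalent LED bulbs lit for an hour: {bulbs_equivalent:.0f}")  # -> 14
```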

Globally, data centers already account for about 2% of electricity use. With the explosion of AI tools and growing demand for services like ChatGPT, energy use is poised to rise even further. As people engage with these systems more frequently — and more politely — the invisible costs grow.

Still, not everyone agrees that politeness is a waste.

Some experts argue that saying “please” and “thank you” to AI isn’t just for show. Kurtis Beavers, part of Microsoft’s Copilot design team, advocates for respectful prompts. He says that when users phrase their questions politely, the AI is more likely to respond in a friendly, cooperative tone — an interaction style that could be valuable in workplace settings and other professional environments.

Microsoft’s WorkLab has echoed this sentiment, noting that “when it clocks politeness, it’s more likely to be polite back.” That means the tone you set with your words can influence the tone you receive — even from a machine.

A 2024 survey found that 67% of Americans regularly use courteous language with AI. Among them, most say it’s just the right thing to do. Interestingly, a small but vocal group (about 12%) admitted they use manners “just in case” — jokingly referencing a future where robots might remember who was nice to them.

But jokes aside, politeness toward AI is starting to raise questions. Is it worth the cost? Does being polite matter if the recipient isn’t a person? And can we afford to keep typing in extra words that may seem small but, on a global scale, carry a multi-million-dollar energy price tag?

Altman seems to think it’s a price worth paying — at least for now. Whether that sentiment holds as energy demands rise and environmental concerns mount remains to be seen.

So the next time you thank ChatGPT for helping with a recipe, resume, or research — just know that you’re participating in a modern dilemma: balancing digital etiquette with real-world impact.
