Being Polite to ChatGPT Costs OpenAI Tens of Millions in Electricity, Says Sam Altman

BigGo Editorial Team

In the world of artificial intelligence, even simple courtesies come with a price tag. While many users naturally incorporate polite phrases when interacting with AI assistants, this human tendency toward politeness is having unexpected financial consequences for the companies behind these technologies.

The Cost of Digital Courtesy

OpenAI CEO Sam Altman recently revealed that users saying "please" and "thank you" to ChatGPT has cost the company tens of millions of dollars in operational expenses. The revelation came in response to a user's question on X (formerly Twitter) about whether polite expressions directed at AI models are wasteful. Altman kept his reply lighthearted, calling it "tens of millions of dollars well spent" and adding "you never know," but it points to the significant resource consumption behind even the most trivial AI interactions.

Why Politeness Costs Money

Every interaction with ChatGPT, including seemingly inconsequential expressions of courtesy, requires the AI to process additional tokens and generate responses in real time. These extra words increase computational load and energy consumption across OpenAI's data centers. According to Goldman Sachs, each ChatGPT query uses roughly ten times as much electricity as a standard Google search, so the extra tokens of politeness add up quickly across millions of daily interactions.
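To see how quickly that scales, consider a rough back-of-envelope estimate. The Python sketch below uses entirely assumed placeholder numbers for extra tokens per message, daily message volume, and per-token compute cost; none of these are OpenAI or Goldman Sachs figures, but the resulting order of magnitude lines up with Altman's remark.

    # Back-of-envelope sketch: annual cost of a few polite extra tokens per message.
    # All numbers below are illustrative assumptions, not OpenAI figures.

    EXTRA_TOKENS_PER_MESSAGE = 6          # assumed: "please", "thank you", punctuation
    MESSAGES_PER_DAY = 1_000_000_000      # assumed daily ChatGPT message volume
    COST_PER_MILLION_TOKENS_USD = 10.0    # assumed blended compute cost

    extra_tokens_per_year = EXTRA_TOKENS_PER_MESSAGE * MESSAGES_PER_DAY * 365
    annual_cost_usd = extra_tokens_per_year / 1_000_000 * COST_PER_MILLION_TOKENS_USD

    print(f"Extra tokens per year: {extra_tokens_per_year:,}")
    print(f"Estimated annual cost: ${annual_cost_usd:,.0f}")
    # With these placeholders: ~2.19 trillion extra tokens and roughly $22 million a year,
    # i.e. the "tens of millions" order of magnitude Altman described.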

AI Energy Consumption Facts

  • ChatGPT query: ~10x more electricity than a standard Google search
  • Current AI share of US power consumption: 4%
  • Projected AI share of US power consumption by 2030: 25%
  • Water usage: Generating 100 words with GPT-4 consumes up to three bottles of water
  • Even a 3-word response ("You are welcome") uses ~1.5 ounces of water
  • Data centers currently account for ~2% of global electricity consumption

Environmental Impact Beyond Dollars

The cost isn't just financial. AI systems rely heavily on energy-intensive data centers that already account for approximately 2% of global electricity consumption. A study from the University of California, Riverside found that using GPT-4 to generate 100 words consumes up to three bottles of water for cooling servers, with even a simple three-word response like "You are welcome" using about 1.5 ounces of water.
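Those two figures are mutually consistent: roughly 1.5 ounces for a three-word reply works out to about half an ounce per word, or around 50 ounces for 100 words, which is close to three half-liter bottles. A quick check in Python, assuming a 500 ml bottle:

    # Consistency check of the reported water figures (assumes a 500 ml bottle).
    ML_PER_OZ = 29.57                        # milliliters per US fluid ounce

    water_per_word_oz = 1.5 / 3              # ~1.5 oz for a 3-word reply
    water_100_words_oz = water_per_word_oz * 100
    bottles = water_100_words_oz / (500 / ML_PER_OZ)

    print(f"{water_100_words_oz:.0f} oz for 100 words ~= {bottles:.1f} bottles")
    # -> 50 oz for 100 words ~= 3.0 bottles, matching the "up to three bottles" figure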

The Growing Energy Footprint of AI

The energy demands of AI are projected to increase dramatically in the coming years. Rene Haas, CEO of semiconductor company Arm Holdings, has warned that AI could account for 25% of America's total power consumption by 2030, a significant jump from roughly 4% today. Data from the Washington Post suggests that if just one in ten working Americans used GPT-4 once weekly for a year, the power required would be comparable to the electricity consumed by every household in Washington, D.C. over 20 days.

The Value of Digital Etiquette

Despite the costs, there may be benefits to maintaining polite interactions with AI systems. Kurt Beavers, a director on Microsoft Copilot's design team, suggests that polite language helps set the tone for AI responses, with the system mirroring the courtesy shown to it. This reciprocal politeness can help shape more natural, meaningful exchanges between humans and AI assistants.

Finding Balance in Human-AI Communication

As AI becomes more integrated into daily life, users face an interesting dilemma: balancing natural, human-like communication styles with awareness of the resources consumed by these interactions. While Altman's comment suggests OpenAI considers the expense worthwhile, it raises questions about sustainable practices in human-AI communication as these technologies continue to scale.

For now, OpenAI seems to view the cost of digital courtesy as an investment in more natural human-AI relationships—even if that investment comes with a multimillion-dollar price tag.