
Saying ‘Please and Thank You’ to ChatGPT is Burning Cash and Energy: Why It Matters

Monday, April 21, 2025, 22:56, by eWeek
If you’ve ever typed “please” or “thank you” into ChatGPT, you’re not alone — and you might be driving up OpenAI’s power bill. Just ask OpenAI CEO and tech billionaire Sam Altman. When an X (formerly Twitter) user joked about how much electricity the company has burned through because of all that polite language in its prompts, Altman replied that it’s “tens of millions of dollars well spent.”

While Altman was likely joking about the cost, the energy use is no laughing matter. ChatGPT's computational load consumes massive amounts of energy in the data centers that power it. Goldman Sachs estimates that each ChatGPT query uses nearly 10 times the energy of a Google search, and that data centers account for around 2% of global energy consumption.
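To put that comparison in rough numbers: the per-query figures commonly cited alongside the Goldman Sachs estimate are about 0.3 watt-hours for a Google search and roughly 2.9 watt-hours for a ChatGPT query. The short Python sketch below treats those figures, the 100-million-message scenario, and the $0.10/kWh electricity price as illustrative assumptions, not numbers confirmed by this article, to show how a flood of one-word "thank you" prompts could add up.

    # Back-of-the-envelope sketch; all figures below are assumptions,
    # not numbers from the article itself.
    GOOGLE_SEARCH_WH = 0.3   # assumed watt-hours per Google search
    CHATGPT_QUERY_WH = 2.9   # assumed watt-hours per ChatGPT query

    ratio = CHATGPT_QUERY_WH / GOOGLE_SEARCH_WH
    print(f"One ChatGPT query ~= {ratio:.1f}x the energy of a Google search")

    # Hypothetical scenario: 100 million users each send one extra
    # "thank you" message per day.
    extra_messages_per_day = 100_000_000
    daily_kwh = extra_messages_per_day * CHATGPT_QUERY_WH / 1000  # Wh -> kWh
    print(f"Extra energy per day: {daily_kwh:,.0f} kWh")

    # At an assumed $0.10 per kWh, electricity alone (ignoring cooling,
    # hardware, and water use) runs to tens of thousands of dollars a day,
    # or millions per year.
    print(f"Rough electricity cost per day: ${daily_kwh * 0.10:,.0f}")

Under these assumptions the extra traffic works out to roughly 290,000 kWh and about $29,000 in electricity per day, which over a few years lands in the "tens of millions of dollars" range Altman joked about.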

The cost of using AI

Rene Haas, CEO of semiconductor designer ARM Holdings, recently warned that AI use could account for a quarter of total power consumption in the U.S. by 2030. According to the Washington Post, if one in every 10 Americans used ChatGPT once a week for a year, the power consumption would equal that of every Washington, D.C. household for 20 straight days. The Post also estimates that ChatGPT uses roughly one bottle of water to write a 100-word email, which means that even polite phrases like "please" and "thank you" contribute to water loss.

Business Energy UK estimates that ChatGPT prompts consume enough electricity each day to power the Empire State Building for 18 months—enough to charge eight million phones. Put another way, it said, ChatGPT uses more electricity than each of the 117 lowest-consumption countries.

Should you bother to be polite to AI?

Even though chatbots don't have feelings or experience emotions like appreciation, basic etiquette can significantly improve the quality of their responses, Kurtis Beavers, a director on the design team for Microsoft's Copilot AI bot, noted in a Microsoft WorkLab blog. Polite language packs a punch.

“Using polite language sets the tone for AI responses,” Beavers explained. Large language models (LLMs), the generative AI systems behind chatbots, are trained on human conversation. Just as email autocomplete predicts your next word from what you've typed, an LLM predicts the next sentence or paragraph of a response. When you're polite, the AI is more likely to pick up on that tone and respond in kind.

“When it clocks politeness, it’s more likely to be polite back,” Beavers said. He noted how generative AI also mirrors the levels of professionalism, clarity, and detail in your prompts. Conversely, if you use rude or provocative language, you’ll get a response similar in tone.

Rather than demand that your chatbot do something, start your prompts with “please.” For example: “Please rewrite this more concisely,” or “Please suggest 10 ways to rebrand this product.” Say “thank you” when the chatbot responds and tell it you appreciate the help. This not only makes it more likely you'll get the same graciousness in return but also tends to improve the AI's responsiveness and performance.

It’s also a good idea to practice this in your human interactions, the WorkLab blog observed. And plenty of people could apparently use the reminder: only some 67 percent of people who use AI are polite to it, according to a December 2024 survey.
https://www.eweek.com/news/being-polite-to-chatgpt-is-expensive/
