Using Polite Language is Costing ChatGPT Millions of Dollars, Says Altman: Artificial Intelligence Trends

Oh, please! 😉 OpenAI CEO Sam Altman says that using polite language like “please” and “thank you” to ChatGPT is costing them millions of dollars.

According to Futurism (Sam Altman Admits That Saying “Please” and “Thank You” to ChatGPT Is Wasting Millions of Dollars in Computing Power, written by Joe Wilkins and available here), when one poster on X-formerly-Twitter wondered aloud “how much money OpenAI has lost in electricity costs from people saying ‘please’ and ‘thank you’ to their models,” Altman chimed in, saying it’s “tens of millions of dollars well spent.”

While it may seem pointless to treat an AI chatbot with respect, some AI architects say it’s an important move. Microsoft’s design manager Kurtis Beavers, for example, says proper etiquette “helps generate respectful, collaborative outputs.”


“Using polite language sets a tone for the response,” Beavers notes. The argument can certainly be made; what we consider “artificial intelligence” might more accurately be described as “prediction machines,” like your phone’s predictive text, but with more autonomy to spit out complete sentences in response to questions or instructions.

“When it clocks politeness, it’s more likely to be polite back,” a Microsoft WorkLab memo notes. “Generative AI also mirrors the levels of professionalism, clarity, and detail in the prompts you provide.”

A late 2024 survey found that 67 percent of US respondents reported being nice to their chatbots. Of those who practice courtesy, 55 percent of American AI users said they do it “because it’s the right thing to do,” while 12 percent did it to appease the algorithm in case of an AI uprising! 🤣

Is using polite language with your chatbot really costing a lot? Apparently, anything we do with them comes at a cost, to the environment.


One Washington Post investigation, done in collaboration with researchers at the University of California, studied the impacts of generating a 100-word email. They found that just one email requires 0.14 kilowatt-hours’ worth of electricity, or enough to power 14 LED lights for an hour. If you were to send one AI email a week over the course of a year, you’d use an eye-watering 7.5 kWh, roughly equal to an hour’s worth of electricity consumed by 9 households in Washington DC.
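For the curious, the figures above check out as back-of-the-envelope arithmetic. Here is a quick sketch in Python using the numbers reported in the article (the cited 7.5 kWh annual figure appears to round up from 52 weekly emails at 0.14 kWh each; the LED comparison assumes 10-watt bulbs, which is my assumption, not stated in the article):

```python
# Figures cited from the Washington Post investigation
KWH_PER_EMAIL = 0.14   # electricity to generate one 100-word AI email
WEEKS_PER_YEAR = 52

# One AI email per week for a year
annual_kwh = KWH_PER_EMAIL * WEEKS_PER_YEAR
print(f"Annual usage: {annual_kwh:.2f} kWh")  # about 7.28 kWh, close to the ~7.5 kWh cited

# 0.14 kWh = 140 watt-hours; assuming 10-watt LED bulbs,
# that runs 14 of them for an hour
led_bulbs = (KWH_PER_EMAIL * 1000) / 10
print(f"LED bulbs powered for one hour: {led_bulbs:.0f}")
```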

The author’s answer to that dilemma is to say: “if you’re mulling whether or not to thank Grok for its efforts, maybe the better move would be to ditch the chatbot and write the email yourself.”

Yeah, sure, that’ll happen. 😉

So, what do you think? Do you say “please” and “thank you” to your AI chatbots? Please share any comments you might have or if you’d like to know more about a particular topic.

Image created with ChatGPT-4o using DALL-E, from the prompt “robot lawyer holding the door open for another robot lawyer entering an office building”.

Disclaimer: The views represented herein are exclusively the views of the authors and speakers themselves, and do not necessarily represent the views held by my employer, my partners or my clients. eDiscovery Today is made available solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscovery Today should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.


Discover more from eDiscovery Today by Doug Austin
