ChatGPT uses 1/2 liter of water for every 20 questions

OpenAI acknowledged the water usage issue and said it is looking for ways to make its LLMs more energy efficient.

According to researcher Shaolei Ren of the University of California, Riverside, ChatGPT and similar language models use up to 500 milliliters of water for every 20 to 50 questions a user asks.

In a paper published on arXiv earlier this year, the researchers pointed out that 500 ml may not sound like much on its own, but aggregated across users worldwide it adds up to an enormous amount of water.
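To make the scale concrete, here is a rough back-of-envelope sketch based only on the figure cited above (up to 500 ml per 20 questions). The daily query volume used below is a hypothetical assumption for illustration; the article does not say how many questions ChatGPT actually serves.

```python
# Back-of-envelope estimate of aggregate water use, assuming the
# upper-bound rate from the paper: 500 ml per 20 questions.
# The daily question count is a HYPOTHETICAL figure, not from the source.

ML_PER_LITER = 1000

def water_liters(num_questions: int, questions_per_500ml: int = 20) -> float:
    """Liters of water implied by num_questions, at 500 ml per
    questions_per_500ml questions."""
    return (num_questions / questions_per_500ml) * 500 / ML_PER_LITER

# Hypothetical example: 10 million questions per day worldwide.
daily_questions = 10_000_000
print(f"{water_liters(daily_questions):,.0f} liters per day")  # ~250,000 liters/day
```

Even at that modest assumed volume, the daily total runs into hundreds of thousands of liters, which is the aggregation effect the researchers highlight.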

“It’s fair to say the majority of the growth [in water use] [in Microsoft’s 2022 environmental report] is due to AI,” including its heavy investment in generative AI and its partnership with OpenAI. “Most people are not aware of the resource usage underlying ChatGPT,” Shaolei Ren told AP News, as quoted by Neowin (11/9).

Microsoft has acknowledged the problem and said it is researching ways to measure its energy use and carbon footprint. The company also said it is looking for ways to make LLMs more energy efficient.

Likewise, OpenAI acknowledged the water usage issue and said it is looking for ways to make its LLMs more energy efficient. Hopefully, both companies will improve on the methods currently used to reduce water and energy consumption.

For context, water use in data center power and cooling infrastructure is commonplace. Even companies like Google and Microsoft still rely on cooling towers to cool their warehouse-scale data centers.

These cooling systems keep the servers from overheating. The servers are where AI models are trained and deployed for inference, and they often have custom designs, including multiple GPUs and/or purpose-built hardware, to speed up AI training and inference.

