One bottle of water for every 100-word email written by ChatGPT. That is the price the environment pays for the artificial intelligence (AI) chatbot to function correctly, according to a new study carried out by The Washington Post in collaboration with researchers at the University of California, Riverside, who analyzed the amount of natural resources the OpenAI chatbot needs to perform its most basic functions. “Each request in ChatGPT goes through a server that performs thousands of calculations to determine the best words to use in the response,” the outlet writes, noting that the servers generate heat while performing those calculations.
According to Shaolei Ren, an associate professor at UC Riverside, water carries the heat generated by the servers to cooling towers, where it leaves the building. In areas where water is relatively scarce, however, electricity is used instead to cool the facilities with systems similar to large air conditioners. This means that the amount of water and electricity needed to process a single AI chatbot response depends on the data center's location, as well as on the user's proximity to the facility.
Water and electricity consumption varies by location
In Texas, for example, ChatGPT consumes 235 milliliters of water to generate a 100-word email. When a user makes the same request from Washington, on the other hand, up to 1,408 milliliters (almost a liter and a half) are consumed per email.
As for electricity consumption, The Washington Post notes that writing a single email requires as much power as running a dozen LED bulbs for about an hour. And if just one-tenth of Americans used ChatGPT to write one email a week for a year, the process would consume as much energy as every home in Washington does in 20 days. A figure that is hard to ignore.
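The aggregate claim above can be sanity-checked with a quick back-of-the-envelope calculation. The constants below (per-bulb wattage, US population) are illustrative assumptions, not figures taken from the study:

```python
# Rough sanity check of the article's energy comparison.
# All constants are assumptions for illustration only.

LED_BULB_WATTS = 10   # assumed draw of one LED bulb
BULBS = 12            # "a dozen LED bulbs"
HOURS = 1             # "for about an hour"

# Energy per 100-word email, in kilowatt-hours
kwh_per_email = LED_BULB_WATTS * BULBS * HOURS / 1000  # 0.12 kWh

US_POPULATION = 335_000_000   # rough 2024 estimate (assumption)
users = US_POPULATION / 10    # "one-tenth of Americans"
emails_per_user = 52          # one email a week for a year

total_kwh = users * emails_per_user * kwh_per_email
print(f"Energy per email: {kwh_per_email} kWh")
print(f"Aggregate annual energy: {total_kwh / 1e6:.0f} GWh")
```

Under these assumptions the total lands in the low hundreds of gigawatt-hours per year, which is the order of magnitude the article's comparison implies.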