We all know how AI-powered systems give us answers within seconds, churning through enormous amounts of data to serve up knowledge in an instant. However, ChatGPT also uses a lot of water!
According to a report on IndiaTimes, researchers from the University of Texas at Arlington and the University of California, Riverside have found that a lot of water is used to keep the data centres behind these AI products cool. The publication also cited a study titled ‘Making AI Less Thirsty’, which estimated that almost 7,00,301 litres (roughly 700,000 litres) of water were needed just to train GPT-3!

Shockingly, that is about the same amount of water required to cool a nuclear reactor. The water used to cool the data centres is essential to keep these systems up and running.
While GPT can answer relevant questions in an instant, it needs to ‘drink’ the equivalent of a 500ml bottle of water to answer a regular conversation of 20 to 50 questions.
“While a 500ml bottle of water might not seem too much, the total combined water footprint for inference is still extremely large, considering ChatGPT’s billions of users,” the authors noted. With such humongous amounts of water being used, AI companies need to address their own “water footprint”.
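To put that per-conversation figure in perspective, here is a minimal back-of-envelope sketch in Python. It assumes the article’s estimate of roughly 500ml of water per conversation of 20 to 50 questions; the daily query volume used for scaling is a hypothetical number chosen purely for illustration, not a figure from the study.

```python
# Back-of-envelope estimate of the water footprint of ChatGPT inference.
# Assumption from the article: ~500 ml of water per 20-50 question conversation.
# The daily query volume below is a hypothetical figure for illustration only.

WATER_PER_CONVERSATION_L = 0.5          # litres (one 500 ml bottle)
QUESTIONS_PER_CONVERSATION = (20, 50)   # low and high estimates

# Water attributable to a single question, in litres
water_per_question_l = [
    WATER_PER_CONVERSATION_L / q for q in QUESTIONS_PER_CONVERSATION
]

# Hypothetical daily query volume, used only to scale the estimate
DAILY_QUERIES = 10_000_000

low = DAILY_QUERIES * min(water_per_question_l)
high = DAILY_QUERIES * max(water_per_question_l)

print(f"Per question: {min(water_per_question_l) * 1000:.0f}-"
      f"{max(water_per_question_l) * 1000:.0f} ml")
print(f"At {DAILY_QUERIES:,} queries/day: {low:,.0f}-{high:,.0f} litres/day")
```

Even under these rough assumptions, inference alone works out to hundreds of thousands of litres per day, which is precisely the authors’ point about the combined footprint across a very large user base.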
One must also note that the water used in these data centres is not ‘lost forever’. However, data centres cannot use just any water: they need water that is free of contamination to prevent corrosion and damage, and it typically comes from rivers, lakes, and other freshwater sources. There is also a difference between “withdrawal” (physically taking water from a source) and “consumption” (water lost through evaporation in the data centre). The water ‘consumed’ is not gone for good; it is released into the air through cooling towers, although it takes some time to return as rain.
This calls for a serious conversation about the environmental sustainability of AI and AI-powered systems.