- cross-posted to:
- fuck_ai@lemmy.world
- climate@slrpnk.net
Writing a 100-word email using ChatGPT (GPT-4, latest model) consumes one 500 ml bottle of water. It uses 140 Wh of energy, enough for 7 full charges of an iPhone Pro Max.
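A quick back-of-envelope check of the iPhone claim. The battery capacity and charging efficiency are my assumptions (roughly 17-18 Wh for a recent iPhone Pro Max, ~85% wall-to-battery efficiency), not figures from the article:

```python
# Back-of-envelope check of the headline claim: 140 Wh ~= 7 iPhone charges.
# Assumed (not from the article): a recent iPhone Pro Max battery holds
# roughly 17.3 Wh, and wall-to-battery charging is ~85% efficient.

EMAIL_ENERGY_WH = 140.0    # claimed energy for one 100-word email
BATTERY_WH = 17.3          # assumed battery capacity
CHARGE_EFFICIENCY = 0.85   # assumed wall-to-battery efficiency

wall_energy_per_charge = BATTERY_WH / CHARGE_EFFICIENCY
charges = EMAIL_ENERGY_WH / wall_energy_per_charge
print(f"{charges:.1f} full charges")  # ~6.9, consistent with the ~7 claimed
```

So the 7-charges figure is at least internally consistent with 140 Wh.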
Why does the article make it sound like cooling a data center results in constant water loss? Is this not a closed loop system?
I’m imagining a giant reservoir heat sink that runs throughout the complex to pull heat out of the surrounding environment, with some liquid evaporating and needing to be replenished. But first of all, we have more efficient liquid coolants, and second, that would be a very lazy solution.
I wonder if they’ve considered geothermal for new data centers. You can run a geothermal loop in reverse and use the earth as a giant heat sink. It’s not water in the loop, it’s refrigerant, and it only needs to be replaced when you find the efficiency dropping, which can take decades.
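For scale, here is a rough sizing sketch of what a ground loop would need to reject data-center heat continuously. The 50 W per metre of borehole is a common rule of thumb I'm assuming, and the 100 MW facility size is also an assumption:

```python
# Rough sizing sketch for using the earth as a heat sink.
# Assumptions (not from the thread): a vertical borehole rejects on the
# order of 50 W per metre of depth in steady state; 100 MW facility.

FACILITY_HEAT_W = 100e6      # assumed hyperscale data center heat load
REJECTION_W_PER_M = 50.0     # assumed steady-state rejection per metre
BOREHOLE_DEPTH_M = 150.0     # typical borehole depth

total_metres = FACILITY_HEAT_W / REJECTION_W_PER_M
boreholes = total_metres / BOREHOLE_DEPTH_M
print(f"{total_metres/1000:.0f} km of borehole, ~{boreholes:.0f} holes")
# ~2000 km and ~13,000 boreholes: plausible as a supplement or at
# smaller scales, hard to see as the whole cooling solution.
```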
Evaporative coolers save a ton of energy compared to refrigeration-cycle closed-loop systems. Like a swamp cooler, the hot liquid that comes from cooling the servers is exposed to the atmosphere, and enough evaporates off to cool the remaining liquid by a decent margin; it's then refrigerated before going back into the servers.
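This is also where the water "loss" in the article comes from: each kilogram of water evaporated carries away about 2.26 MJ of latent heat (a standard physical constant). A quick sketch of what that means per kWh of server heat, reusing the article's 140 Wh figure:

```python
# Why evaporative cooling consumes water: each kg evaporated carries
# away ~2.26 MJ of latent heat (standard value for water).

LATENT_HEAT_J_PER_KG = 2.26e6  # latent heat of vaporization of water
J_PER_KWH = 3.6e6

litres_per_kwh = J_PER_KWH / LATENT_HEAT_J_PER_KG  # ~1.6 L/kWh
email_kwh = 0.140                                  # the article's 140 Wh
print(f"{litres_per_kwh:.1f} L evaporated per kWh of heat rejected")
print(f"{litres_per_kwh * email_kwh * 1000:.0f} ml for one 140 Wh email")
# ~1.6 L/kWh and ~220 ml per email -- less than the article's 500 ml,
# so the headline figure presumably also counts water used upstream
# in electricity generation.
```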
A data centre near me is using it, and the fire service is used to being called by people worried that the huge clouds of water vapor are smoke.
You need something to move the heat away, like water or air. A solid mass that just absorbs heat will saturate pretty quickly.
It varies from data center to data center, but it is very likely that they use municipal water for cooling. Maintaining a reservoir is extremely expensive for the amount of thermal mass it requires; these things kick off HEAT.
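To put numbers on that: here's a sketch of how big a sealed (non-evaporating) reservoir would have to be just to buffer one day of heat. The 100 MW facility and the tolerable 10 °C temperature rise are both assumptions:

```python
# How much water a sealed reservoir needs to absorb one day of heat
# with no evaporation. Assumed: 100 MW facility, 10 C allowed rise.

FACILITY_HEAT_W = 100e6
SECONDS_PER_DAY = 86_400
SPECIFIC_HEAT = 4186.0   # J/(kg*K), water
DELTA_T = 10.0           # allowed temperature rise, K

heat_j = FACILITY_HEAT_W * SECONDS_PER_DAY
mass_kg = heat_j / (SPECIFIC_HEAT * DELTA_T)
pools = mass_kg / 1000 / 2500  # Olympic pool ~2500 m^3
print(f"{mass_kg/1e6:.0f} kt of water, ~{pools:.0f} Olympic pools per day")
# ~206 kt / ~83 pools per day: a closed reservoir saturates fast unless
# the heat is actively rejected somewhere (air, evaporation, ground).
```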
I don’t know why they aren’t using reclaimed water from treatment plants. I don’t see why potable water is necessary as long as the substitute isn’t corrosive, but I might be missing something here.