Apparently data centers routinely burn through water at a rate of about 1.9 liters per kWh of energy spent computing. Yet I can 🎮 HARDCORE GAME 🎮 on my hundreds-of-watts GPU for several hours without pouring any of my Mountain Dew into the computer? Even if the PC is water cooled, the coolant stays inside the loop, except in exceptional circumstances.
Meanwhile, water comes out of my A/C unit and makes the ground around it all muddy.
How am I running circles around the water efficiency of a huge AI data center, with net-negative water consumption overall?
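For scale, here's a quick back-of-the-envelope sketch in Python using that 1.9 L/kWh figure; the 400 W draw and 4-hour session are made-up numbers just for illustration:

```python
# Back-of-the-envelope: what would a gaming session "cost" in water
# if it were charged at the quoted data-center rate of ~1.9 L/kWh?
# The wattage and session length below are hypothetical examples.

WATER_PER_KWH_L = 1.9  # liters per kWh, the rate cited above

def implied_water_use(power_watts: float, hours: float) -> float:
    """Water (liters) a load would consume at the data-center rate."""
    energy_kwh = power_watts / 1000 * hours
    return energy_kwh * WATER_PER_KWH_L

# e.g. a 400 W gaming PC running for 4 hours:
print(f"{implied_water_use(400, 4):.1f} L")  # -> 3.0 L
```

So at the data-center rate, one evening of gaming would nominally evaporate a few liters, which is what makes the comparison with a sealed home loop feel so strange.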
that’s not really a big factor.
Yeah, because you’ve measured the water intake and export of every large body of water? I forgot, you’re obviously an expert who knows how to read when a data center takes more water than a town. Love your stern optimism. Maybe, like, wander off somewhere else so you can feel important in your views, because it ain’t happening with me here, bud.
no, you obviously don’t want people to talk to you. that’s fair.