Apparently data centers routinely burn through water at a rate of about 1.9 liters per kWh of energy spent computing. Yet I can 🎮 HARDCORE GAME 🎮 on my hundreds-of-watts GPU for several hours without pouring any of my Mountain Dew into the computer? Even if the PC is water cooled, the coolant stays inside the loop except under exceptional circumstances.
Meanwhile, water comes out of my A/C unit and makes the ground around it all muddy.
How am I running circles around the water efficiency of a huge AI data center, with an overall negative water consumption?
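Rough math on my end, just to put numbers on it (the GPU wattage and session length are assumptions I picked for illustration, not measurements):

```python
# Back-of-envelope: what the reported data-center water rate would imply
# for one gaming session. GPU power and session length are assumed values.

WATER_PER_KWH_L = 1.9      # reported data-center consumption, liters per kWh
GPU_POWER_W = 350          # assumed "hundreds-of-watts" GPU draw
SESSION_HOURS = 4          # assumed length of the gaming session

energy_kwh = GPU_POWER_W / 1000 * SESSION_HOURS        # 1.4 kWh
implied_water_l = energy_kwh * WATER_PER_KWH_L         # ~2.7 liters

print(f"Energy used: {energy_kwh:.2f} kWh")
print(f"Water a data center would 'spend' on that: {implied_water_l:.2f} L")
```

So at the data-center rate, that one session would "cost" somewhere around two and a half liters, and my PC consumed exactly none.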
If it’s truly a closed loop, why do you need a lake? A true closed loop has zero need for a local water source. So either there’s some loss they’re compensating for (water, or heat, leaving the system faster than it enters, faster than the local infrastructure can make up for), or the lake is just there to be the other end of the heat exchanger?
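My only guess (and it is just a guess on my part) is that the water doesn’t stay liquid at all, i.e. it gets evaporated to dump the heat. If that’s what’s happening, the rough physics seems to land in the same ballpark as that 1.9 L/kWh figure:

```python
# Guess-check: if the heat is rejected by evaporating water (my assumption,
# not something I've confirmed), how much water does 1 kWh of heat boil off?

LATENT_HEAT_KJ_PER_KG = 2450   # approx. latent heat of vaporization of water
                               # near room temperature, kJ/kg
KJ_PER_KWH = 3600              # 1 kWh = 3,600 kJ

water_per_kwh_l = KJ_PER_KWH / LATENT_HEAT_KJ_PER_KG   # ~1.5 kg, i.e. ~1.5 L
print(f"Water evaporated per kWh of heat: {water_per_kwh_l:.2f} L")
```

That works out to roughly 1.5 L per kWh, which is at least the same order as 1.9 L/kWh, so is the lake basically make-up water for whatever gets evaporated? Genuinely asking.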