Apparently data centers routinely burn through water at a rate of about 1.9 liters per kWh of energy spent computing. Yet I can 🎮 HARDCORE GAME 🎮 on my hundreds-of-watts GPU for several hours without pouring any of my Mountain Dew into the computer? Even if the PC is water cooled, the coolant stays in the loop except in exceptional circumstances.
Meanwhile, water comes out of my A/C unit and makes the ground around it all muddy.
How am I running circles around the water efficiency of a huge AI data center, with an overall negative water consumption?
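For scale, here's a quick back-of-envelope sketch of what that quoted rate would imply for my own rig, assuming a 400 W draw and a 4-hour session (my numbers, not measured):

```python
# What the quoted data-center rate would imply for my gaming session.
# Assumptions (mine): 400 W at the wall, 4-hour session, 1.9 L/kWh as quoted above.
power_kw = 0.4
hours = 4
water_per_kwh = 1.9

energy_kwh = power_kw * hours                  # 1.6 kWh
implied_water_l = energy_kwh * water_per_kwh   # ~3.0 L

print(f"{energy_kwh:.1f} kWh -> {implied_water_l:.1f} L at data-center rates")
```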
Why would they design around evaporative cooling when water consumption is a problem?
Because evaporative cooling is much cheaper and easier to accomplish at scale, and megacorps don’t care about long-term resource constraints until it begins to affect their wallets.
Also, places in red states allow for free or cheap pollution and waste.
Because it’s cheap and easy.
Because it's cheap, easy, compact, well understood, and makes the numbers look good. The number in question is the ratio of energy used by the entire facility to energy used by the silicon alone (I forget what it's called). The alternative is dissipating heat through dry radiators, but that pushes the number to something like 3. Evaporative cooling gets it closer to 1.2.
PUE (Power Usage Effectiveness).
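For anyone curious, a minimal sketch of the ratio being described, using hypothetical loads and the rough ~3 vs ~1.2 figures from the comment above (not measurements):

```python
# PUE = total facility energy / energy delivered to the IT equipment.
# A PUE of 1.0 would mean zero overhead for cooling, power conversion, lights, etc.
# The loads below are hypothetical; the ~3 and ~1.2 ratios are the rough figures
# from the comment above, not measurements.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    return total_facility_kwh / it_equipment_kwh

it_load_kwh = 1000.0  # energy drawn by the servers themselves (hypothetical)

print(pue(3000.0, it_load_kwh))   # dry radiators, per the comment: 3.0
print(pue(1200.0, it_load_kwh))   # evaporative cooling, per the comment: 1.2
```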
sounds like we need to charge them more for water
Instead, WE are paying more for water and power to subsidize them.
Because they are assholes who hate the environment. The same reason they are using fossil fuels to power their slop centers instead of renewables.
It's the most effective form of heat transfer per liter of water.
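Rough numbers on why, using textbook constants (my own back-of-envelope, not a data center spec):

```python
# Why evaporation moves so much heat per liter: phase change vs. just warming water.
# Constants are textbook values; the comparison is my own back-of-envelope.
LATENT_HEAT_KJ_PER_KG = 2260      # energy absorbed evaporating 1 kg (~1 L) of water
SPECIFIC_HEAT_KJ_PER_KG_K = 4.2   # energy absorbed warming 1 kg of water by 1 degC
KJ_PER_KWH = 3600

evaporate_1l_kwh = LATENT_HEAT_KJ_PER_KG / KJ_PER_KWH             # ~0.63 kWh per liter evaporated
warm_1l_by_10c_kwh = SPECIFIC_HEAT_KJ_PER_KG_K * 10 / KJ_PER_KWH  # ~0.012 kWh per liter warmed 10 degC

print(f"evaporating 1 L removes ~{evaporate_1l_kwh:.2f} kWh of heat")
print(f"warming 1 L by 10 degC removes ~{warm_1l_by_10c_kwh:.3f} kWh of heat")
# One evaporated liter carries roughly as much heat as ~50 liters pushed through a
# closed loop and warmed by 10 degC, which is also why ~1.9 L/kWh is a plausible rate.
```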
Because line must go up
it wasn’t a problem before they started doing this