Apparently data centers routinely burn through water at a rate of about 1.9 liters per kWh of energy spent computing. Yet I can 🎮 HARDCORE GAME 🎮 on my hundreds-of-watts GPU for several hours without pouring any of my Mountain Dew into the computer? Even if the PC is water cooled, the cooling water stays in the computer, except in exceptional circumstances.
Meanwhile, water comes out of my A/C unit and makes the ground around it all muddy.
How am I running circles around the water efficiency of a huge AI data center, with net-negative water consumption?
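For scale, here's a rough back-of-envelope sketch in Python. The latent heat of vaporization (~2.26 MJ/kg) and 1 kWh = 3.6 MJ are standard figures; the 400 W draw and 4-hour session are made-up numbers just to illustrate the comparison, not measurements.

```python
# Back-of-envelope: where a ~1.9 L/kWh figure could plausibly come from,
# and what it would imply for one gaming session. Inputs below the constants
# are illustrative assumptions, not measured values.

LATENT_HEAT_MJ_PER_KG = 2.26   # heat absorbed by evaporating 1 kg of water
MJ_PER_KWH = 3.6               # 1 kWh = 3.6 MJ

# If every joule of heat were removed by evaporating water (density ~1 kg/L),
# you'd boil off roughly this much water per kWh:
evaporative_limit_l_per_kwh = MJ_PER_KWH / LATENT_HEAT_MJ_PER_KG
print(f"Pure evaporative cooling: ~{evaporative_limit_l_per_kwh:.1f} L/kWh")  # ~1.6 L/kWh

# Hypothetical gaming session: 400 W total draw for 4 hours.
session_kwh = 0.400 * 4
quoted_rate_l_per_kwh = 1.9
print(f"At {quoted_rate_l_per_kwh} L/kWh, that session would 'cost' "
      f"~{session_kwh * quoted_rate_l_per_kwh:.1f} L of water")  # ~3 L
```

So the quoted rate is in the same ballpark as "evaporate roughly all the heat away," while a gaming PC just dumps its heat into room air with fans and evaporates nothing.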
A condenser can be as simple as a glass dome in a cool room. There is no need for any electricity or heat.
Pretty sure the glass dome traps the heat they’re trying to dissipate.
It’s literally there to let evaporated water cool and become liquid again. 🤦♂️
And what happens to the heat? Heat can't just magically disappear, which means the water can't cool unless the heat can dissipate somewhere. So it would have to dissipate into the dome. What happens to the dome if you keep pumping hot vapor into it? It heats up. And once it heats up, the vapor stops condensing and the whole cooling system stops working.
I’m not saying it couldn’t work in theory, I’m saying it doesn’t work in practice, because the dome would have to be insanely big, maybe the size of a small nation.
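To put a rough number on "insanely big," here's a quick sketch. Everything in it is an assumption for illustration: a hypothetical 100 MW facility, a free-convection heat-transfer coefficient of ~10 W/m²/K on the dome surface, and a 10 K dome-to-ambient temperature difference.

```python
# Rough sizing of a passive "dome" condenser using Q = h * A * dT.
# All inputs are illustrative assumptions, not data for any real facility.

heat_load_w = 100e6     # heat the dome must reject at steady state (100 MW)
h_w_per_m2_k = 10.0     # assumed passive heat-transfer coefficient (W/m^2/K)
delta_t_k = 10.0        # assumed dome-to-ambient temperature difference (K)

# Solve Q = h * A * dT for the required surface area A:
area_m2 = heat_load_w / (h_w_per_m2_k * delta_t_k)
print(f"Required dome surface area: ~{area_m2:,.0f} m^2 "
      f"(~{area_m2 / 1e6:.1f} km^2)")  # ~1,000,000 m^2
```

The exact number depends entirely on the assumed heat load and temperature difference (it scales linearly with the load and inversely with the ΔT), but the point stands: passively rejecting data-center-scale heat needs an enormous surface area compared to the building itself.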
Note your use of the word “cool.”