Apparently data centers routinely burn through water at a rate of about 1.9 liters per kWh of energy spent computing. Yet I can 🎮 HARDCORE GAME 🎮 on my hundreds-of-watts GPU for several hours without pouring any of my Mountain Dew into the computer? Even if the PC is water-cooled, the coolant stays inside the loop except in exceptional circumstances.
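(For scale, here's the back-of-the-envelope math, assuming a 300 W draw and a 4-hour session, both numbers my own guesses:)

```python
# Rough sketch: what the quoted data-center rate would imply for one gaming session.
gpu_power_kw = 0.3      # "hundreds of watts" -> assuming 300 W
session_hours = 4       # "several hours" -> assuming 4 h
water_per_kwh = 1.9     # liters per kWh, the figure quoted above

energy_kwh = gpu_power_kw * session_hours     # 1.2 kWh
water_liters = energy_kwh * water_per_kwh     # ~2.3 liters
print(f"Implied water use: {water_liters:.1f} L per session")
```

So at the data-center rate my session should evaporate a couple of liters, and it visibly doesn't.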

Meanwhile, water comes out of my A/C unit and makes the ground around it all muddy.

How am I running circles around the water efficiency of a huge AI data center, with an overall negative water consumption?

  • Ziggurat@jlai.lu · 5 days ago

    This is the complicated part of water consumption: saving water in the Netherlands won't make it rain in Morocco.

    However, there is only so much rainwater stored in the ground at a given time and brought in by the rivers. This water needs to go mostly to agriculture, then to human consumption, and finally to industry. Once it's back in the clouds we don't fully know where it will fall again, let alone whether it will be polluted.

    Sure, it's a renewable resource, but the problem starts when you use the water faster than the rate at which it renews, especially during summer. In Europe the problem will only get worse with global warming: the Alpine glaciers are disappearing, meaning we'll lose a major water reserve for summer.