Apparently data centers routinely burn through water at a rate of about 1.9 liters per kWh of energy spent computing. Yet I can 🎮 HARDCORE GAME 🎮 on my hundreds-of-watts GPU for several hours without pouring any of my Mountain Dew into the computer? Even if the PC is water cooled, the coolant stays in the loop, barring exceptional circumstances.

Meanwhile, water comes out of my A/C unit and makes the ground around it all muddy.

How am I running circles around the water efficiency of a huge AI data center, with an overall negative water consumption?
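
Back-of-the-envelope, taking that 1.9 L/kWh figure at face value and plugging in a hypothetical gaming session (the wattage and hours below are assumptions, not measurements):

```python
# Rough sketch: what a gaming session would "cost" in water if it were
# billed at the quoted data-center rate of 1.9 L per kWh.
datacenter_water_l_per_kwh = 1.9   # figure quoted above
gpu_watts = 400                    # assumed GPU draw
rest_of_pc_watts = 100             # assumed CPU/board/fans draw
hours = 4                          # assumed session length

energy_kwh = (gpu_watts + rest_of_pc_watts) / 1000 * hours    # 2.0 kWh
implied_water_l = energy_kwh * datacenter_water_l_per_kwh     # ~3.8 L

print(f"{energy_kwh:.1f} kWh -> ~{implied_water_l:.1f} L at data-center rates")
# At home the same heat just dumps into the room air (or a closed loop),
# so the evaporated-water figure is basically zero.
```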

  • rumschlumpel@feddit.org · 5 days ago

    At this point, I wouldn’t be surprised if it turned out that they’re destroying the environment deliberately, for some nefarious purpose. E.g. maybe they think it’s easier to rule the masses if natural resources are very scarce.

    • Melvin_Ferd@lemmy.world · 5 days ago

      I think it’s more likely that yellow journalism is making an issue out of something that isn’t as big a deal as it’s made out to be.

        • Melvin_Ferd@lemmy.world · 5 days ago

          When haven’t they?

          Fear and anger sell.

          This AI shit is the leftist version of “illegal immigrants are stealing yur jobs”

          • brucethemoose@lemmy.world · 5 days ago

            To be fair, the “infinite scaling” vision Altman and such are selling is quite a dystopia. And they are the ones pushing it.

            It’s not reality at all. But it’s kinda reasonable for people to hate that specifically.

              • brucethemoose@lemmy.world · 5 days ago

                No.

                The path I see forward for ML is small, task-specific models running on your smartphone or PC, with some kind of bitnet architecture so they use basically no power.

                That’s the hope, anyway, but all the pieces already exist: bitnet works (rough sketch of the idea below), extreme task-specific training works per a paper that just came out, and NPU frameworks are starting to come together.

                If that sounds incompatible with corporate AI, that’s because it is.
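
                For anyone wondering what “bitnet” actually means here: roughly, weights quantized to {-1, 0, +1}, so matrix multiplies collapse into adds and subtracts. A minimal numpy sketch of BitNet-b1.58-style absmean quantization (the sizes and names are made up for illustration):

                ```python
                import numpy as np

                def absmean_ternary_quantize(w, eps=1e-8):
                    # BitNet-b1.58-style absmean quantization: scale by mean |w|,
                    # then round each weight to the nearest of {-1, 0, +1}.
                    scale = np.mean(np.abs(w)) + eps
                    w_q = np.clip(np.round(w / scale), -1, 1).astype(np.int8)
                    return w_q, scale

                def ternary_matmul(x, w_q, scale):
                    # With ternary weights the multiplies reduce to adds/subtracts on
                    # real hardware; here we just emulate the math: y = x @ (w_q * scale).
                    return (x @ w_q.astype(x.dtype)) * scale

                rng = np.random.default_rng(0)
                w = rng.normal(size=(64, 64)).astype(np.float32)   # toy weight matrix
                x = rng.normal(size=(1, 64)).astype(np.float32)    # toy activations

                w_q, scale = absmean_ternary_quantize(w)
                print("full precision:", (x @ w)[0, :3])
                print("ternary approx:", ternary_matmul(x, w_q, scale)[0, :3])
                ```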