• cogman@lemmy.world · 2 days ago

    And I’ll bet roughly 50-90% of that usage is idiots doing crypto/AI garbage.

    • Kairos@lemmy.today · 2 days ago

      No lol, the amount of power that cloud services use is atrocious. The serverless trend makes it all add up. It’s cheaper in terms of hardware, but oh boy do all those layers of abstraction make things heavy, especially loading each applet and the communication between applets (I’m forgetting if “applets” is the correct name).

      I actually don’t believe the ~5% figure at all, especially given the track record of honesty (or the lack of it) these companies have.

      Sure, datacenters are rather efficient, but multiply that across 330 million people in the US and it adds up.

      • cogman@lemmy.world · edited · 2 days ago

        The amount of power AI and crypto require is orders of magnitude more than the amount required by pretty much any regular application. The company I work at uses somewhere around 2000 CPU cores’ worth of compute at AWS (and we have ~100 microservices; we’re a fairly complex org that way).

        Generally speaking, an 80-core CPU system takes ~200W of power. That works out to about 25 boxes for our 2000 cores, so my company’s entire fleet eats about 5kW when running full bore (it isn’t doing that all the time). My company is not a small company.
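        Back-of-envelope in Python with those same rough numbers:

        ```python
        # Rough fleet power estimate: ~2000 cores spread across 80-core boxes at ~200W each
        total_cores = 2000
        cores_per_box = 80
        watts_per_box = 200

        boxes = total_cores / cores_per_box      # ~25 boxes
        fleet_watts = boxes * watts_per_box      # ~5000W, i.e. ~5kW at full bore
        print(f"{boxes:.0f} boxes, ~{fleet_watts / 1000:.1f} kW")
        ```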

        Compare that to what a single Nvidia A100 eats. Those GPUs take up to 400W of power, and when doing AI/crypto work you’re running them as hard as possible (meaning you’re eating the full 400W). That means just 12 or 13 of those GPUs eat the same amount of power that my company, with 100 different applications, eats while running full bore. Now imagine the model training of something like ChatGPT, which can eat pretty much as many GPUs as you can throw at it.
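        Same napkin math for the GPU side, using the 400W figure above:

        ```python
        # How many A100s at full tilt match the ~5kW fleet figure above?
        fleet_watts = 5000      # from the estimate above
        a100_watts = 400        # max draw of one A100 running flat out

        gpus_to_match = fleet_watts / a100_watts   # ~12.5 GPUs
        print(f"~{gpus_to_match:.1f} GPUs running flat out = the whole fleet")
        ```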

        To put all of this in perspective: 5kW is roughly what a mini-split system will consume.

        Frankly, I’m way more concerned about my company’s travel budget in terms of CO2 emissions than I am about our datacenter usage.

      • ddh@lemmy.sdf.org · 2 days ago

        I would bet that hardware being way more efficient, plus corporate IT infrastructure being consolidated into data centers, ends up much more energy efficient than the alternative. The fact that we are running much more layered and compute-intensive systems doesn’t really change that.