Writing a 100-word email using ChatGPT (GPT-4, latest model) consumes one 500 ml bottle of water. It uses 140 Wh of energy, enough for 7 full charges of an iPhone Pro Max.

  • pyre@lemmy.world · 30 days ago

    It won’t if you don’t force it to. That’s like saying companies will pollute less if you give them enough time. No, you have to grab them by the balls and force them to do it.

    • XIIIesq@lemmy.world · 30 days ago

      I think it’s fair to say that pretty much every industry is more efficient and cleaner than it used to be, and I don’t see why AI would be an exception to that.

      • pyre@lemmy.world · 30 days ago

        I think you’re not thinking about what efficiency means for corporations.

        • XIIIesq@lemmy.world · 29 days ago

          I think it’s exactly what I’m thinking about, unless I’m missing something specific that you’d like to put forward?

          If I own a bottled drinks company and the energy cost is 10p a bottle, but a new, more efficient process is invented that would lower my energy cost to 5p a bottle, that’s going to look like a wise investment to make. A few pence over several thousand products adds up pretty quickly (see the sketch after this comment).

          I could pocket the difference as extra profit, lower my unit price to the consumer to make my product more competitive in the market, or do a bit of both.
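A back-of-envelope sketch of the arithmetic above, in Python; the 10p and 5p unit costs are the commenter's hypothetical figures, and the production volume is an assumed placeholder rather than a real number:

```python
# Back-of-envelope sketch: unit costs are the hypothetical figures from the
# comment above; the yearly volume is an assumed placeholder, not real data.
old_energy_cost = 0.10      # £ per bottle with the current process (10p)
new_energy_cost = 0.05      # £ per bottle with the more efficient process (5p)
bottles_per_year = 500_000  # assumed production volume

saving_per_bottle = old_energy_cost - new_energy_cost
annual_saving = saving_per_bottle * bottles_per_year

print(f"Saving per bottle: £{saving_per_bottle:.2f}")  # £0.05
print(f"Annual saving:     £{annual_saving:,.0f}")     # £25,000 at these numbers
```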

          • areyouevenreal@lemm.ee · 25 days ago

            It seems basic logic like this doesn’t actually work on these people. Most really can’t get their heads around the fact that energy costs money, and that companies want to use less of it wherever it is possible and practical to do so.

        • areyouevenreal@lemm.ee · 29 days ago

          Mainly because energy and data centers are both expensive, and companies want to use as little as possible of both, especially on the energy side. OpenAI isn’t exactly profitable. There is a reason companies like Microsoft release smaller models like Phi-2 that can be run on individual devices rather than in data centers.
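As a concrete illustration of the point about smaller models, here is a minimal sketch of running microsoft/phi-2 on a single machine with the Hugging Face transformers library, assuming a recent transformers release that includes the Phi architecture, a PyTorch install, and enough local memory for the weights; the prompt is just an example:

```python
# Minimal sketch: run Microsoft's small Phi-2 model locally instead of
# calling a hosted API. Assumes transformers and torch are installed and
# the model weights (a few GB) fit on the local machine.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/phi-2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto")

prompt = "Write a short, polite email asking a colleague to reschedule a meeting."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=120)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```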

          • frayedpickles@lemmy.cafe · 25 days ago

            I didn’t realize coal plants were concerned about data centers or AI. TIL.

            But in the interest of being slightly less of a dick and responding to what you said even though it’s kind of a non sequitur: companies are only vaguely interested in efficiency. I think it’s more accurate to say that AI is hot for everyone right now, so there are more eyes on it, which makes the concept you laid out valid. Where it’s invalid, in my experience, is that efficiency is just based on where some executive happens to be paying attention, not on an honest attempt to look at return on investment in a rigorous way across the enterprise.

            • areyouevenreal@lemm.ee · 25 days ago

              > I didn’t realize coal plants were concerned about data centers or AI. TIL.

              What? How does that relate to anything I just said?

              > But in the interest of being slightly less of a dick and responding to what you said even though it’s kind of a non sequitur: companies are only vaguely interested in efficiency.

              How is it a non sequitur? If anything, the thing you just said makes no sense. Energy is probably the biggest cost these companies have, and I believe that is true even for regular data centers and cloud services, which is why they always try to use the latest, most energy-efficient hardware. It’s still not as bad as most anti-AI people seem to believe, mainly because the most energy-intensive part, training, happens only once per model (a rough amortisation sketch follows this comment).

              > I think it’s more accurate to say that AI is hot for everyone right now, so there are more eyes on it, which makes the concept you laid out valid. Where it’s invalid, in my experience, is that efficiency is just based on where some executive happens to be paying attention, not on an honest attempt to look at return on investment in a rigorous way across the enterprise.

              Human labour is expensive, so trying to replace it with AI, even if AI is also expensive, is typically still worth it.

              You talk about experience, but I honestly don’t think you have any. Do you actually work in tech? What are your qualifications? Most of the people coming here to complain about this stuff don’t actually have a functional understanding of the thing they are complaining about.
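On the training-versus-inference point above, a minimal amortisation sketch in Python; every number here is a made-up placeholder chosen only to show the shape of the argument, not a measurement of any real model:

```python
# All figures are made-up placeholders to illustrate amortisation of a
# one-off training cost over many queries; they are not real measurements.
training_energy_wh = 1e12   # assumed one-off training energy, in Wh
inference_energy_wh = 0.3   # assumed energy per query, in Wh
queries_served = 1e11       # assumed lifetime query count for the model

amortised_training_wh = training_energy_wh / queries_served
total_per_query_wh = amortised_training_wh + inference_energy_wh

print(f"Amortised training energy per query: {amortised_training_wh:.2f} Wh")
print(f"Total energy per query:              {total_per_query_wh:.2f} Wh")
# The per-query share of the one-off training cost shrinks as the query count grows.
```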