• Ugurcan@lemmy.world · 16 days ago (edited)

    I’m thinking otherwise. I think GPT-5 is a much smaller model, with some fallback to previous models if required.

    Since it’s running on the exact same hardware with a mostly similar algorithm, using less energy would directly mean it’s a “less intense” model, which translates into inferior quality in American Investor Language (AIL).

    And 2025’s investors don’t give a flying fuck about energy efficiency.
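    The “smaller model with fallback” idea above could be sketched roughly like this. This is a toy illustration, not OpenAI’s actual routing logic; the model stand-ins, confidence scores, and threshold are all invented:

```python
def cheap_model(prompt: str) -> tuple[str, float]:
    """Toy stand-in for a small model: returns (answer, confidence)."""
    if len(prompt.split()) < 10:  # pretend short prompts are easy
        return f"small-model answer to: {prompt}", 0.9
    return f"small-model guess at: {prompt}", 0.4

def big_model(prompt: str) -> str:
    """Toy stand-in for a larger, more expensive fallback model."""
    return f"big-model answer to: {prompt}"

def route(prompt: str, threshold: float = 0.7) -> tuple[str, str]:
    """Answer with the cheap model; escalate only on low confidence."""
    answer, confidence = cheap_model(prompt)
    if confidence >= threshold:
        return "small", answer
    return "big", big_model(prompt)

model_used, answer = route("What is 2 + 2?")
print(model_used)  # short prompt stays on the small model
```

    Routed this way, most traffic lands on the cheap model, so average energy per response drops even though the fallback path still exists.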

    • PostaL@lemmy.world · 16 days ago

      And they don’t want to disclose the energy efficiency becaaaause … ?

    • RobotZap10000@feddit.nl · 16 days ago (edited)

      They probably wouldn’t really care how efficient it is, but they certainly would care that the costs are lower.

    • Sl00k@programming.dev · 15 days ago

      It also has a very flexible “thinking” mode, which means far, far fewer tokens spent on most people’s responses.