• db0@lemmy.dbzer0.com · 12 hours ago

    At least anecdotally, Andreas over at 82MHz.net tried running an AI model locally on his laptop and it took over 10 minutes for just one prompt.

    OK just the 4th sentence clearly shows this person has no clue what they’re talking about.

    • ddh@lemmy.sdf.org · 10 hours ago

      Yep, clueless. I stopped reading at that point. For the audience: large language models come in all sizes, and you can run some small but useful ones fairly quickly even without a GPU. They keep getting more capable for their size, too. Remember the uproar about DeepSeek R1? Well, progress hasn’t stopped.

      • db0@lemmy.dbzer0.com · 10 hours ago

        It’s not even that. It’s like trying to run an AAA game on a 10-year-old laptop and complaining the game is garbage because your frame rates are too low.