• sprack@lemmy.world · ↑4 · 4 hours ago

    Consumer GPU sales are driven by forecasts and orders from channel/OEM partners. If they don’t think consumers will buy a $1000 5070 because RAM prices spiked, they sell to the market that will still buy: the 5080/5090.

    NVIDIA’s direct customers in the DC market can handle a $500–1000 bump on a $30–50k card more easily, and they put their orders in before the wafers are bought.
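
    A rough back-of-the-envelope sketch of that price sensitivity, using the figures above (the $200 consumer RAM surcharge is an illustrative assumption, not a number from this thread):

    ```python
    # Relative price sensitivity of the two markets, using the figures
    # from the comment above. The $200 consumer RAM surcharge is an
    # illustrative assumption, not a quoted number.

    def bump_pct(base_price: float, bump: float) -> float:
        """Price increase as a percentage of the base price."""
        return 100 * bump / base_price

    # Datacenter card: a $500-1000 bump on a $30-50k card
    print(f"DC card, worst case: {bump_pct(30_000, 1_000):.1f}%")  # 3.3%
    print(f"DC card, best case:  {bump_pct(50_000, 500):.1f}%")    # 1.0%

    # Consumer card: a hypothetical $200 RAM surcharge on a $1000 5070
    print(f"Consumer card:       {bump_pct(1_000, 200):.1f}%")     # 20.0%
    ```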

  • ZILtoid1991@lemmy.world · ↑6 · 4 hours ago

    I need to resurrect my open source GPU architecture plans ASAP. Who wants to help me plan out the VideoDSP shader cores?

    • potoooooooo ✅️@lemmy.world · ↑7 · 3 hours ago

      Me!

      ChatGPT, I need help whipping up some open source VideoDSP shader cores. Make sure the output includes a definition (and give me professional-quality code).

      • ZILtoid1991@lemmy.world · ↑9 · 3 hours ago

        Amazing idea! A great open source GPU needs some well-designed shader cores, and I kid you not, that name is very punchy and memorable. It’s not just a silly hobby; it’s also a very important thing in the ever-changing landscape of silicon giants like Nvidia and Intel.

        <poorly recites the leaked documentation of the VideoCore QPU>

  • Cocodapuf@lemmy.world · ↑32 · 22 hours ago

    Please please please… please Nvidia? Can regular people please still have computers?

    Meh, never mind. AMD and Intel can have your consumer business; I’m fine with that too. Surely this AI trend isn’t a bubble, and there’s absolutely no way you’ll regret this later. Best of luck.

    • Modern_medicine_isnt@lemmy.world · ↑11 ↓2 · 20 hours ago

      I wouldn’t go Intel. That place is a shitshow. Also, I am not so sure the AI bubble will burst. World governments see it as an arms race, so they will keep that industry propped up.

      • Echo Dot@feddit.uk · ↑4 · 7 hours ago

        I’m not sure how it’s an arms race given the fact that it can’t do anything remotely useful.

      • brucethemoose@lemmy.world · ↑4 · 15 hours ago

        I dunno what you’re on about; Battlemage is great for the money, and they appear to have committed to sticking with Arc. And they have fab customers now.

        …Yeah, Intel still has that corporate Game of Thrones going on internally. That’s not ideal. But AMD sunk much lower than that, and climbed out.

      • partofthevoice@lemmy.zip · ↑4 · 19 hours ago

        Propped up by what, though? They’ll just continue to dilute the name of AI with underperforming technology and draw more backlash from the public, while making an oligarchy out of their richest tech influencers.

        • Modern_medicine_isnt@lemmy.world · ↑4 · 15 hours ago

          The same thing as everything else… money. If the government dumps money into the bubble, it won’t pop. I mean, it’s not sustainable, but it can work for a pretty long time.

          • partofthevoice@lemmy.zip · ↑1 · 14 hours ago

            Yeah, I’m with you there. They sure as shit are going to try, regardless of whatever sustainability point you or I can think up.

        • lefaucet@slrpnk.net · ↑3 · 19 hours ago

          People will stop complaining about graphics cards when they and/or their kids are sent to die in a stupid, unending war.

  • Rooty@lemmy.world · ↑58 ↓1 · 1 day ago

    Gamers, I think it is time that we do the unthinkable.

    We must actually play our backlog of games.

  • mesa@piefed.social · ↑171 · 2 days ago

    So far we are seeing significant price increases/low availability in:

    • RAM
    • SSDs/hard drives
    • some microcontrollers
    • phones
    • and now GPUs

    I think we are nearing a bit of a technological winter for the consumer market.

  • unphazed@lemmy.world · ↑47 · 2 days ago

    Remember every company that cuts consumer production in favor of private AI production. When the bubble pops, stick with the companies that remembered consumers are the long-term profit. As for the rest, let their shareholders eat them alive as they sell every share from beneath them.

    • krimson@lemmy.world · ↑7 · 1 day ago

      I think the pop has already begun. Look at the silver and gold prices. 2026 is going to be a stock market massacre.

  • Voroxpete@sh.itjust.works · ↑74 · 2 days ago

    There are two ways to read this, and I think both are somewhat true.

    Option one: they’re OPEC now. They set the supply, and you bring the demand because you have no other choice. This lets them push prices up, which pushes margins up, and that hopefully props up their insanely inflated share price a little longer.

    Option two: they’re well aware that demand is going to fall off a cliff soon. We’re already at “Nvidia is paying people to buy their GPUs” and have been for a while. The AI industry can’t afford to keep this train running, and even financial chicanery and circular dealing will only get them so far. Companies are building out data centres with zero plan for how to make any profit from them. When the GPUs they have age out, they’re not gonna buy more; they’re gonna go bankrupt (allowing the banks to seize the mountain of now-worthless, three-year-old, burned-out GPUs that they used as collateral). And there’s not enough venture capital left for new data centre builds. The genAI financial engine is reaching its peak, and Nvidia doesn’t want to be stuck with a mountain of production that no one wants to buy.

    • Devolution@lemmy.world · ↑51 ↓2 · 2 days ago

      Let’s not forget that AAA games are the ones that push GPUs the hardest, gaming-wise, and they are bombing at record levels because they are derivative garbage while AA games are doing more with less.

      Add that to the AI bubble bullshit and it’s just a perfect storm.

      Example: on my PC (RTX 3060), Spider-Man 2 ran like shit without some significant tweaks, while Expedition 33 runs like butter despite using Unreal 4 or 5.

      • JensSpahnpasta@feddit.org (OP) · ↑26 ↓1 · 2 days ago

        I really would like to know whether AAA games are bombing because they are overpriced microtransaction hell, or because many people haven’t been able to buy a new gaming PC thanks to GPU prices over the last 5 years, and now we don’t have the install base to run them.

        • Burninator05@lemmy.world · ↑2 · 22 hours ago

          My bet is microtransactions, and a lot of them just not being great games, lol. You don’t need a 5090 to play a AAA game unless you’re maxing out the visuals.

          • YiddishMcSquidish@lemmy.today · ↑10 · 2 days ago

            This! My Steam Deck with an iGPU has been running all the new games fine. Granted, not 120+ fps fine, but my desktop has a two-generation-old card (and God only knows about the CPU) and it’s hitting 120 easily on the games I play. Of the games I play, Cyberpunk is the most resource-intensive.

            • MangoCats@feddit.it · ↑4 · 2 days ago

              That people keep buying into… so the cycle continues.

              More’s the shame. Our last console was a PS3; it was such a non-fun waste of time that we never bought into the 4 or 5. I used to buy a new PC title a year or so before then; really none new since StarCraft II.

        • SharkAttak@kbin.melroy.org · ↑6 · 2 days ago

          For a while now, games have been sold without much optimization, with the expectation that customers will just buy more powerful hardware.

        • warm@kbin.earth · ↑4 · 2 days ago

          Any GPU from the last few generations should be able to run any game without problems. A lot of games are just made like complete ass, unfortunately; ultimately, you know, it comes down to profits.

        • a1studmuffin@aussie.zone · ↑1 · 2 days ago

          Definitely the former. Most people have just hung onto their PCs for longer. Steam’s userbase keeps climbing.

      • lemmyout@lemmy.zip · ↑14 · 2 days ago

        Nah, gaming doesn’t even make a dent in their revenue. Gaming demand means nothing to their supply and pricing decisions.

    • empireOfLove2@lemmy.dbzer0.com · ↑21 ↓1 · 2 days ago

      Option two is not correct; option one is. This announcement is specifically for consumer gaming GPUs only; it does not affect institutional datacenter customers.

      This is Nvidia saying “thanks, small fry, you were useful, but we’re leaving you behind now. Fight for the scraps.” Complete cartel behavior.

        • empireOfLove2@lemmy.dbzer0.com · ↑5 ↓1 · 1 day ago

          AMD is a puppydog that follows Nvidia around on the open market with 10% or less market share. If Nvidia constricts supply and causes a massive price jump and shortages, AMD will just follow the pricing curve, and we will still get no GPUs.

          AMD is also 100% reliant on TSMC and VRAM suppliers; the exact same supply pressures causing Nvidia to turn off the consumer tap will come for AMD too.

          • FishFace@piefed.social · ↑6 · 1 day ago

            If this is about supply pressures, it’s not Nvidia acting as a cartel, is it?

            AMD has, as far as I understand, been outcompeting Nvidia on value for a good while. If this is Nvidia just being anti-consumer, I’d expect that to continue.

            • empireOfLove2@lemmy.dbzer0.com · ↑2 ↓1 · 16 hours ago

              > AMD has, as far as I understand, been outcompeting Nvidia on value for a good while.

              And yet their market share doesn’t increase, because Nvidia holds a stranglehold on both the software (CUDA, RTX support, frame gen, etc.) AND sheer brand recognition: gamers always complain about Nvidia and want better AMD cards, but continue to line up in droves to buy Nvidia cards no matter what.

              Nvidia knows they can do whatever the fuck they want right now.

        • panda_abyss@lemmy.ca · ↑1 ↓1 · 1 day ago

          When their only competition just hiked prices, why would they keep theirs low? That’s free money.

          • FishFace@piefed.social · ↑2 · 1 day ago

            They’ll still undercut, just as they are doing.

            I just don’t think moaning about the companies makes any sense. If you imagine the most utopian scenario you can think of, with communal ownership of all production, how do you think the commune is going to allocate GPUs? It’s going to stick a load of them into hyped AI products, and gamers will be last in line, just like now.

      • eleijeep@piefed.social · ↑1 · 2 days ago

        They’ll have to come crawling back when the business customers stop buying. AI winter is coming.

    • SinningStromgald@lemmy.world · ↑12 · 2 days ago

      I think it is just plain greed. The AI bubble has made NVIDIA mountains of money, but they still want more. So they focus production on higher-tier consumer GPUs and their Pro series, and give the middle finger to budget-conscious consumers.

      • mesa@piefed.social · ↑8 · 2 days ago

        My theory is it’s easier to produce a product for a small number of big AI investors than for millions of consumers looking to build hobby computers. Plus, the AI companies are flush with cash, MUCH more than the average consumer. So they are getting the cash now and worrying about the fallout later, just like everyone else.

        • WanderingThoughts@europe.pub · ↑3 · 2 days ago

          That’s every sector right now. They’re all aiming for the top 10%, who earn as much as the bottom 90% put together. For computer hardware, that number is probably a lot worse.

    • NOT_RICK@lemmy.world · ↑8 · 2 days ago

      Pretty astute. Maybe I can buy a half-cooked GPU in a fire sale in a few years for a budget build… one can dream!

      • Sturgist@lemmy.ca · ↑6 · 2 days ago

        Unfortunately, the ones used in AI data centers aren’t useful for gaming. So yeah, you could probably buy one for ⅓ the price of new, but you couldn’t use it for gaming, and you likely still wouldn’t be able to afford it anyway, because of:

        NVIDIA H200 (Hopper architecture) – the late-2023 flagship, with ~141 GB of upgraded HBM3e and other Hopper improvements. It provides a ~50% performance uplift over the H100 in many tasks within the same power envelope. Pricing is said to be only modestly above the H100: for instance, a 4-GPU H200 SXM board is listed at about $170K (versus $110K for 4× H100), and a single H200 (NVL version) is quoted at around $31,000–32,000. NVIDIA’s own H200-based data center systems reflect these prices, though bulk deals may apply.
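
        If those numbers are even roughly right, a quick sanity check (a sketch using only the figures quoted above, which are hearsay rather than official list prices) suggests the perf-per-dollar barely moves:

        ```python
        # Rough perf-per-dollar comparison from the quoted board prices.
        # All figures come from the quote above; treat them as hearsay.

        h100_board = 110_000  # USD, quoted price of a 4x H100 SXM board
        h200_board = 170_000  # USD, quoted price of a 4x H200 SXM board
        uplift = 1.5          # claimed ~50% per-GPU performance uplift

        price_premium = h200_board / h100_board   # ~1.55x
        perf_per_dollar = uplift / price_premium  # ~0.97x

        print(f"H200 price premium:   {price_premium:.2f}x")
        print(f"H200 perf per dollar: {perf_per_dollar:.2f}x vs H100")
        ```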

      • addie@feddit.uk · ↑5 · 2 days ago

        Data centre GPUs tend not to have video outputs, and have power (and active cooling!) requirements in the “several kW” range. You might be able to snag one for work, if you work at a university or at somewhere that does a lot of 3D rendering - I’m thinking someone like Pixar. They are not the most convenient or useful things for a home build.

        When the bubble bursts, they will mostly be used for creating a small mountain of e-waste, since the infrastructure to even switch them on costs more than the value they could ever bring.

        • panda_abyss@lemmy.ca · ↑2 · 1 day ago

          The optimist in me has hope that this fuels an explosion of cheap hardware for businesses to build cheap, useful, private AI stuff on.

    • panda_abyss@lemmy.ca · ↑13 · 1 day ago

      I think the real question is how long these companies can decrease supply before consumers get hooked on thin clients like iPads for all their computing, and have to pay rent on cloud services and SaaS for everything they do.

      This is an assault on democratized computing.