Generative “AI” data centers are gobbling up trillions of dollars in capital, not to mention heating up the planet like a microwave. The resulting capacity crunch in memory production has sent RAM prices sky high, up more than 100 percent in the last few months alone. Multiple stores are so tired of adjusting prices day to day that they won’t even display them anymore. You find out how much it costs at checkout.

    • notabot@piefed.social · 3 days ago

      It wouldn’t be quite so bad if the previous gold rush ended first, but they seem to just be stacking up.

      • Truscape@lemmy.blahaj.zone · 2 days ago

        Speak for yourself - scored a nice GPU upgrade during the crypto crash, maybe something similar will be achievable after this insanity hits the brakes.

          • Trainguyrom@reddthat.com · 2 days ago (edited)

            Gaming GPUs during normal crypto markets don’t compute fast enough to mine crypto profitably, but if crypto prices climb high enough, such as during a boom cycle, mining on gaming GPUs can become profitable.

            Edit to add: For crypto there’s basically a set speed that any given GPU mines at: the hash rate. It doesn’t change noticeably over time through software updates, nor does the GPU’s power consumption. It’s basically a fixed cost per unit of cryptocurrency mined with any given hardware. If the value earned by mining exceeds the cost to run the GPU, then GPU mining can quickly start making sense again.
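            The break-even arithmetic above can be sketched in a few lines of Python. All the numbers here are made up for illustration; real hash rates, per-hash rewards, and power draw vary by GPU model and network difficulty:

            ```python
            def daily_mining_profit(hash_rate_mh, reward_per_mh_usd, power_w, electricity_usd_per_kwh):
                """Profit (USD/day) = hashing revenue minus electricity cost."""
                revenue = hash_rate_mh * reward_per_mh_usd       # USD earned per day
                energy_kwh = power_w / 1000 * 24                 # kWh consumed per day
                cost = energy_kwh * electricity_usd_per_kwh
                return revenue - cost

            # The hash rate and power draw are fixed properties of the card,
            # so only the coin's value (reward_per_mh) moves the result.
            normal_market = daily_mining_profit(60, 0.02, 200, 0.15)  # slim margin
            boom_market   = daily_mining_profit(60, 0.06, 200, 0.15)  # suddenly worth it
            ```

            Same GPU, same electricity bill; a 3x rise in coin value turns a marginal card into a money-maker, which is exactly the boom-cycle flip described above.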

              • Trainguyrom@reddthat.com · 2 days ago

                Machine learning models have much different needs than crypto. Both run well on gaming GPUs, and both run even better on much higher-end GPUs, but machine learning models really, really need fast memory, because the entire set of weights is loaded into graphics memory for processing. There are tools that will spill over into system memory, but these models are latency sensitive, so crossing the CPU bus to pass tens of gigabytes of data between the GPU and system memory adds too much latency.
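                A quick back-of-the-envelope for why the weights have to fit in VRAM (the bytes-per-parameter figure depends on quantization, and activations and KV cache add more on top, so treat this as a floor, not a spec):

                ```python
                def vram_needed_gib(n_params_billion, bytes_per_param=2):
                    """Rough VRAM floor for inference: parameter count times
                    precision (2 bytes/param assumes fp16), ignoring
                    activations and KV cache."""
                    return n_params_billion * 1e9 * bytes_per_param / 2**30

                # A 70B-parameter model at fp16 needs on the order of 130 GiB
                # just for weights, far beyond a gaming GPU's 16-24 GiB, while
                # a 7B model squeezes into a single high-end gaming card.
                big = vram_needed_gib(70)   # ~130 GiB
                small = vram_needed_gib(7)  # ~13 GiB
                ```

                That gap is why shared-memory SoCs and high-VRAM datacenter cards matter so much for this workload.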

                Machine learning also splits into training and inference. Training takes a long time, takes less time with more/faster compute, and you simply can’t do anything with the model while it’s training. Inference is still compute heavy, but it doesn’t require anywhere near as much as the training phase. So organizations will typically rent as much hardware as possible for the training phase, to get the model running, and making money, as quickly as possible.
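                Under the idealized assumption that training scales linearly with GPU count, the rent-everything strategy is almost free: the total rental bill stays flat while the wall-clock time shrinks. A toy sketch (the numbers are arbitrary):

                ```python
                def training_time_hours(total_gpu_hours, n_gpus):
                    """Idealized linear scaling: wall-clock time shrinks as
                    GPUs are added, assuming no parallelization overhead."""
                    return total_gpu_hours / n_gpus

                def rental_cost_usd(total_gpu_hours, n_gpus, usd_per_gpu_hour):
                    """Total bill = hours of wall-clock time x GPUs x rate."""
                    return training_time_hours(total_gpu_hours, n_gpus) * n_gpus * usd_per_gpu_hour

                # A job needing 10,000 GPU-hours costs the same with 100 GPUs
                # or 1,000, but with 1,000 the model ships 10x sooner.
                slow = training_time_hours(10_000, 100)     # 100.0 hours
                fast = training_time_hours(10_000, 1_000)   # 10.0 hours
                ```

                Real-world scaling is sublinear, but the basic incentive holds: more rented GPUs means the model starts earning sooner at roughly the same compute cost.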

                In terms of GPU availability this means they’re going to target high-end GPUs, such as packing AI developer stations full of 4090s and whatever the heck Nvidia replaced the Tesla series with. Some of the new SoCs with shared system/video memory, such as AMD’s and Apple’s, also fill a niche for AI developers and AI enthusiasts, since they enable large amounts of high-speed video memory at relatively low cost. Realistically, the biggest impact AI is having on the gaming GPU space is that it’s changing the calculation AMD, Nvidia, and Intel make when planning out their SKUs: they’re likely being stingy on memory specs for lower-end GPUs to try to push anyone with specific AI models to run toward much more expensive GPUs.

      • Trainguyrom@reddthat.com · 2 days ago (edited)

        There was a nice window, from about a year or two ago until about 3 months ago, where no individual component was noticeably inflated. Monitors took the longest to recover from the pandemic shortages, so it was arguably around the beginning of this year that prices seemed to fully normalize.

        It’s funny, because at work we’ve been pushing hard on Windows 11 refreshes all year, warning that there would likely be a rush of folks refreshing at the last possible minute at the end of the year and inflating prices. We ended up being correct about the inflated prices, but it was actually the AI bubble that did it.

    • Assassassin@lemmy.dbzer0.com · 2 days ago

      This is why I’m still running ddr4. Every time I think about upgrading a generation, there’s a run on some integral component.

      • Dultas@lemmy.world · 1 day ago

        DDR4 is expensive as shit too now. I was trying to build out a new rack for my homelab, and 256GB of RAM went from like $300 6 months ago to $1500.

        • Assassassin@lemmy.dbzer0.com · 2 days ago

          With how well my 5600X still performs, I could very well see it lasting that long. Assuming it doesn’t randomly kill itself after a few years like my previous Ryzen 5.

          • gravitas_deficiency@sh.itjust.works · 2 days ago

            I was silly and got myself a 5950X. But I feel less silly about it now tbh. It’s gonna become my new homelab core whenever I get the chance to do a new gaming build again that’s not a high 4-figure investment.

            • Assassassin@lemmy.dbzer0.com · 2 days ago

              Totally worth it with how well Ryzens have held up performance-wise. Unless you’re doing some really CPU-heavy stuff or have a beast of a GPU, you probably won’t get bottlenecked by the CPU for at least 5 more years.

              Unless you’re using windows in your homelab. I assume you’re not since you have a home lab.

          • Truscape@lemmy.blahaj.zone · 2 days ago (edited)

            In a sane world, the limitations of a CPU socket would be reached, and then newer SKUs would no longer be released, and all stock for prospective builders would be second hand.

            That’s clearly not the case here. AM4 continues to get new CPU releases, and parts are still available new at retail, years after support officially ended. That’s a good thing for variety and entry-level machines, but such dependency means a future CPU could be limited in featureset/performance if it releases on AM4 instead of AM5; there may be enough demand to force designers to downgrade chips for AM4 compatibility.

              • Trainguyrom@reddthat.com · 2 days ago

              The good thing about new AM4 boards being available at this point in time is that you have options to keep older hardware running. Usually the CPU and memory will outlive the motherboard. Much like those new Chinese motherboards supporting 4th and 6th gen Intel CPUs, this is great for longevity and reduces how much new production is needed.

              In a sane world, the limitations of a CPU socket would be reached, and then newer SKUs would no longer be released

              I’d argue that it would be best if computers were more like cars: a new platform gets released each decade or so, small improvements are made to individual parts, but the parts are largely interchangeable within the platform and produced for a decade or two before production is retired. More interchangeable parts, a slower release cycle, and more opportunities for repair instead of replacement.

      • Wildmimic@anarchist.nexus · 2 days ago

        I did so too - just upgraded my 2600X to a shiny 5950X, the nicest CPU my aging mainboard can run. With 16 cores and 64 gigs of RAM, I see a future where I simply replace the entire machine for daily use and make this one a very nice server.

    • Em Adespoton@lemmy.ca · 2 days ago

      It’s why I started treating computers as commodities. I rarely upgrade anymore; I just wait the 5 years and buy an entirely new system.

        • Pope-King Joe@lemmy.world · 2 days ago

          This is about my upgrade cadence too, except for storage. I ran my Ryzen 1600 until the 7000 series dropped, then upgraded mobo+RAM at once for about $600.

          I then moved the old parts to another case to use as a low-load server, only for both the motherboard and CPU to die within a few weeks. 🫡

        • gandalf_der_12te@discuss.tchncs.de · 2 days ago (edited)

          Yeah, I think the correct response to planned obsolescence on the part of computer manufacturers is to exclusively buy products from companies that have produced long-lived machines in the past.

          That gives manufacturers an incentive to make their machines last longer, instead of making them fail sooner to sell new products more frequently.

    • mack@lemmy.sdf.org · 2 days ago (edited)

      Because we’re in an era where there will always be a gold rush for some specific component. Upgrades have slowed down considerably in the past 10 years: my laptop is 4 years old and still kicks like the first day, and I still game on my 8-year-old laptop, which is permanently attached to the TV and running as a Steam machine with more than decent performance.

      This wasn’t even thinkable in the ’00s.

      I’m pretty sure after hard disks, GPUs, and RAM, the next shortage is either Arm CPUs or some specific future type of PSU.

    • Hubi@feddit.org · 2 days ago

      I feel like the luckiest person because I built my last PC right before the crypto hype and my current one right before the AI bubble.