• MystikIncarnate@lemmy.ca · 1 day ago

      Microsoft.

      Microsoft is buying them for AI.

      From what I understand, ChatGPT runs on Azure servers.

      • 9488fcea02a9@sh.itjust.works · 2 days ago

        GPU mining hasn’t been profitable for many years now.

        People just keep parroting anti-crypto talking points without actually knowing what’s going on.

        To be clear, 99% of the crypto space is a scam. But blaming it for GPU shortages and high prices is just misinformation.

        • D06M4@lemmy.zip · 2 days ago

          Most people buy Nvidia because that’s what’s commonly recommended in reviews. “Want to use AI? Buy Nvidia! Want the latest DX12+ support? Buy Nvidia! Want to develop videogames or encode video? Buy Nvidia! Want to upgrade to Windows 11? Buy Nvidia!” Nonstop Nvidia adverts everywhere, with tampered benchmarks and whatnot. Other brands’ selling points aren’t well known, and the general notion is that if it’s not Nvidia, it sucks.

        • Tinidril@midwest.social · 2 days ago

          Profitability of Bitcoin mining depends on the value of Bitcoin, which has more than doubled in the last 12 months. It’s true that large-scale miners have moved on from GPUs to purpose-designed hardware, but GPUs and mining hardware depend on a lot of the same limited resources, including fab capacity.

          You’re right that crypto doesn’t drive the GPU market like it did during the crypto boom, but I think you’re underestimating the lingering impact. I would also not rule out a massive Bitcoin spike driven by actions of the Trump administration.

          • Taldan@lemmy.world · 2 days ago (edited)

            “Profitability of Bitcoin mining is dependent on the value of Bitcoin”

            No it isn’t. It’s driven by the supply of miners and the demand for transactions. The value of Bitcoin is almost entirely independent.

            ASICs, which are used to mine Bitcoin, use very different chips than modern GPUs. Ethereum is the one that affected the GPU market, and mining is no longer a thing for Ethereum.

            A massive Bitcoin spike would not affect the GPU market in any appreciable way.

            Crypto mining is pretty dumb, but misinformation helps no one.

            • Tinidril@midwest.social · 1 day ago

              ASICs and GPUs do share significant dependencies in the semiconductor supply chain. Building fabs fast enough to keep up with demand is difficult and resource-constrained, both in expertise and in high-quality materials.

              You’re wrong about the impact of Bitcoin’s market value on mining profitability:

              https://www.investopedia.com/articles/forex/051115/bitcoin-mining-still-profitable.asp
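
              Back-of-the-envelope sketch of the relationship (in Python, with made-up but plausible numbers; these are not figures from that article): expected revenue scales with both the coin’s price and your share of total network hashrate, and profit is whatever is left after power costs.

              ```python
              # Back-of-the-envelope only; illustrative numbers, not live data.
              def daily_mining_profit(my_ths, network_ths, btc_price_usd,
                                      power_kw=3.5, usd_per_kwh=0.10):
                  """Rough daily profit in USD for a single miner."""
                  blocks_per_day = 144          # ~one block every 10 minutes
                  block_reward_btc = 3.125      # subsidy after the April 2024 halving
                  share = my_ths / network_ths  # expected fraction of blocks won
                  revenue = share * blocks_per_day * block_reward_btc * btc_price_usd
                  power_cost = power_kw * 24 * usd_per_kwh
                  return revenue - power_cost

              # Example: one ~200 TH/s ASIC against a ~600,000,000 TH/s network.
              print(daily_mining_profit(200, 600_000_000, btc_price_usd=100_000))
              ```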

              Another thing to consider is that many coins still use proof of work, and an ASIC designed for one might not work for the others. Some miners (especially the scammiest ones) want the flexibility to switch coins at will. That doesn’t change the fact that ASICs now dominate, but GPUs still have a share, especially for some of the newer scam coins.

    • brucethemoose@lemmy.world · 2 days ago (edited)

      Not as many as you’d think. The 5000 series is not great for AI because the cards have very little VRAM relative to their price.

      4x 3090 or 3060 homelabs are the standard, heh.
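
      Rough numbers on why VRAM is the wall (weights only, ignoring KV cache and activations; the bytes-per-weight factors are approximate):

      ```python
      # Approximate VRAM needed just to hold model weights.
      BYTES_PER_WEIGHT = {"fp16": 2.0, "int8": 1.0, "q4": 0.5}  # rough averages

      def weight_vram_gb(params_billions, quant):
          # billions of params * bytes per weight = gigabytes of weights
          return params_billions * BYTES_PER_WEIGHT[quant]

      for quant in ("fp16", "int8", "q4"):
          print(f"70B model @ {quant}: ~{weight_vram_gb(70, quant):.0f} GB of weights")
      # ~140 / 70 / 35 GB -- even q4 won't fit a 16 GB 5080, hence stacks of 24 GB 3090s.
      ```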

      • MystikIncarnate@lemmy.ca · 1 day ago

        Who the fuck buys a consumer GPU for AI?

        If you’re not doing it in a home lab, you’ll need more juice than anything an RTX 3000/4000/5000/whatever-000 series could offer.

        • brucethemoose@lemmy.world · 1 day ago (edited)

          “Who the fuck buys a consumer GPU for AI?”

          Plenty of people. Consumer GPU + CPU offloading is a pretty common way to run MoE (mixture-of-experts) models these days, and not everyone will drop $40K just to run DeepSeek in CUDA instead of hitting an API or something.

          I can (just barely) run GLM-4.5 on a single 3090 desktop.
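
          For the curious, the offload setup is nothing exotic. A minimal sketch with llama-cpp-python; the model file name and layer count are placeholders you’d tune to whatever fits in your VRAM:

          ```python
          # Minimal sketch of partial GPU offload with llama-cpp-python.
          # "glm-4.5-q4.gguf" and n_gpu_layers=30 are placeholders: keep as many
          # layers on the GPU as VRAM allows; the rest runs from system RAM on the CPU.
          from llama_cpp import Llama

          llm = Llama(
              model_path="glm-4.5-q4.gguf",  # hypothetical quantized GGUF file
              n_gpu_layers=30,               # layers kept on the GPU
              n_ctx=8192,                    # context window
          )

          out = llm("Explain mixture-of-experts in one sentence.", max_tokens=64)
          print(out["choices"][0]["text"])
          ```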

          • MystikIncarnate@lemmy.ca · 21 hours ago

            … Yeah, for yourself.

            I’m referring to anyone running an LLM for commercial purposes.

            Y’know, 80% of Nvidia’s business?

            • brucethemoose@lemmy.world · 18 hours ago

              I’ve kinda lost this thread, but what does that have to do with consumer GPU market share? The servers are a totally separate category.

              I guess my original point was agreement: the 5000 series is not great for ‘AI’, at least not like everyone makes it out to be, to the point where folks who can’t drop $10K on a GPU are picking up older cards instead. But if you look at download stats for these models, there is real interest in running stuff locally instead of using ChatGPT, just like people are interested in internet-free games, or Lemmy instead of Reddit.

              • MystikIncarnate@lemmy.ca · 12 hours ago

                The original post is about Nvidia’s domination of discrete GPUs, not consumer GPUs.

                So I’m not limiting myself to people running an LLM on their personal desktop.

                That’s what I was trying to get across.

                And it’s right on point for the original material.

                • brucethemoose@lemmy.world · 12 hours ago

                  I’m not sure the bulk of datacenter cards count as ‘discrete GPUs’ anymore, and they aren’t counted in that survey. They’re generally sold socketed into 8P servers with crazy interconnects, hyper-specialized for what they do. Nvidia does sell some repurposed gaming silicon as ‘low-end’ PCIe server cards, but those don’t see a ton of use compared to the big-silicon sales.

                  • MystikIncarnate@lemmy.ca · 10 hours ago

                    I wouldn’t be surprised in the slightest if they are included in the list. I dunno; I’m not the statistician who crunched the numbers, I didn’t collect the data, and the source material isn’t available for me to examine.

                    What I can say is that the article talks about “discrete” GPUs rather than just “GPUs” to exclude iGPUs, because Intel dominates that space, along with AMD: it’s hard to make an iGPU when you don’t make CPUs, and the two largest CPU manufacturers make their own.

                    The overall landscape of the GPU market is very different from what this data implies.

        • brucethemoose@lemmy.world · 2 days ago

          Yeah. What does that have to do with home setups? No one is putting an H200 or L40 in their homelab.

            • brucethemoose@lemmy.world · 2 days ago (edited)

              It mentions desktop GPUs, which are not part of this market cap survey.

              Basically I don’t see what the server market has to do with desktop dGPU market share. Why did you bring that up?