• GaMEChld@lemmy.world · 1 day ago

      The vast majority of consumers do not watch or read reviews. They walk into a Best Buy or whatever retailer and grab the GeForce box with the biggest number within their budget. LTT even did a breakdown at some point showing that even their most-watched reviews have little to no impact on sales numbers. Nvidia has the mindshare; in a lot of people’s minds, GeForce = graphics. And I say all that as someone who is currently on a Radeon 7900 XTX. I’d be sad to see AMD and Intel quit the dGPU space, but I wouldn’t be surprised.

      • MystikIncarnate@lemmy.ca · 1 day ago

        Microsoft.

        Microsoft is buying them for AI.

        From what I understand, ChatGPT is running on Azure servers.

        • 9488fcea02a9@sh.itjust.works · 2 days ago

          GPU mining hasn’t been profitable for many years now.

          People have just kept parroting anti-crypto talking points for years without actually knowing what’s going on.

          To be clear, 99% of the crypto space is a scam. But blaming it for GPU shortages and high prices is just misinformation.

          • D06M4@lemmy.zip · 2 days ago

            Most people are buying Nvidia because that’s what’s commonly recommended in reviews. “Want to use AI? Buy Nvidia! Want the latest DX12+ support? Buy Nvidia! Want to develop video games or encode video? Buy Nvidia! Want to upgrade to Windows 11? Buy Nvidia!” Nonstop Nvidia adverts everywhere, with tampered benchmarks and whatnot. Other brands’ selling points aren’t well known, and the general notion is that if it’s not Nvidia, it sucks.

          • Tinidril@midwest.social · 2 days ago

            Profitability of Bitcoin mining is dependent on the value of Bitcoin, which has more than doubled in the last 12 months. It’s true that large-scale miners have moved on from GPUs to purpose-designed hardware, but GPUs and mining hardware are mutually dependent on a lot of the same limited resources, including fabs.

            You are right that crypto doesn’t drive the GPU market like it did during the crypto boom, but I think you are underestimating the lingering impact. I would also not rule out a massive Bitcoin spike driven by actions of the Trump administration.

            • Taldan@lemmy.world · 2 days ago

              Profitability of Bitcoin mining is dependent on the value of Bitcoin

              No it isn’t. It’s driven by the supply of miners and the demand for transactions. The value of Bitcoin is almost entirely independent.

              ASICs, which are used to mine Bitcoin, use very different chips than modern GPUs. Ethereum is the one that affected the GPU market, and mining is no longer a thing for Ethereum.

              A massive Bitcoin spike would not affect the GPU market in any appreciable way

              Crypto mining is pretty dumb, but misinformation helps no one

              • Tinidril@midwest.social · 1 day ago

                ASICs and GPUs do share significant dependencies in the semiconductor supply chain. Building fabs fast enough to keep up with demand is difficult and resource-constrained, both by expertise and by high-quality materials.

                You are wrong about the impact of Bitcoin’s market value on the profitability of mining it.

                https://www.investopedia.com/articles/forex/051115/bitcoin-mining-still-profitable.asp
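
                A back-of-the-envelope sketch of why price feeds straight into mining profitability (all numbers below are illustrative placeholders, not current network values):

                ```python
                # Expected daily mining revenue for a given share of network hashrate.
                # Revenue scales linearly with the BTC price while electricity cost
                # stays fixed, so price directly moves the break-even point.
                def daily_revenue_usd(my_hashrate_ths, network_hashrate_ths,
                                      block_reward_btc, btc_price_usd,
                                      blocks_per_day=144):
                    share = my_hashrate_ths / network_hashrate_ths
                    return share * blocks_per_day * block_reward_btc * btc_price_usd

                # Placeholder values: 100 TH/s against a 600M TH/s network.
                print(f"${daily_revenue_usd(100, 600_000_000, 3.125, 60_000):.2f}/day before electricity")
                ```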

                Another thing to consider is that many coins still use proof of work, and an ASIC designed for one might not work for others. Some miners (especially the most scammy ones) choose GPUs for the flexibility to switch coins at will. That doesn’t change the fact that ASICs now dominate, but GPUs do still have a share, especially for some of the newer scam coins.

      • brucethemoose@lemmy.world · 2 days ago

        Not as many as you’d think. The 5000 series is not great for AI because the cards have very little VRAM relative to their price.

        4x3090 or 3060 homelabs are the standard, heh.
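
        For rough intuition, a minimal sketch of the usual VRAM estimate (parameter count times bytes per weight, plus an overhead factor that is a loose assumption):

        ```python
        # Rough VRAM needed to hold model weights for inference.
        def vram_gb(params_billion, bits_per_weight, overhead=1.2):
            # weights plus ~20% for KV cache and activations (rough assumption)
            return params_billion * (bits_per_weight / 8) * overhead

        # A 70B model at 4-bit needs ~42 GB -- beyond any single consumer card,
        # which is why stacking 24 GB 3090s stays popular.
        print(f"{vram_gb(70, 4):.0f} GB")
        ```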

        • MystikIncarnate@lemmy.ca · 1 day ago

          Who the fuck buys a consumer GPU for AI?

          If you’re not doing it in a home lab, you’ll need more juice than anything an RTX 3000/4000/5000/whatever-000 series card could offer.

          • brucethemoose@lemmy.world · 1 day ago

            Who the fuck buys a consumer GPU for AI?

            Plenty. Consumer GPU + CPU offloading is a pretty common way to run MoEs these days, and not everyone will drop $40K just to run DeepSeek in CUDA instead of hitting an API or something.

            I can (just barely) run GLM-4.5 on a single 3090 desktop.
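
            For what it’s worth, a minimal sketch of that GPU + CPU split using Hugging Face transformers/accelerate (the model ID and memory caps are placeholders, not a recommendation):

            ```python
            from transformers import AutoModelForCausalLM, AutoTokenizer

            model_id = "some-org/some-moe-model"  # hypothetical; substitute a real checkpoint
            tokenizer = AutoTokenizer.from_pretrained(model_id)
            model = AutoModelForCausalLM.from_pretrained(
                model_id,
                device_map="auto",                        # let accelerate place layers
                max_memory={0: "22GiB", "cpu": "64GiB"},  # cap GPU 0 near a 3090's 24 GB
            )
            inputs = tokenizer("Hello", return_tensors="pt").to(model.device)
            out = model.generate(**inputs, max_new_tokens=32)
            print(tokenizer.decode(out[0], skip_special_tokens=True))
            ```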

            • MystikIncarnate@lemmy.ca · 20 hours ago

              … Yeah, for yourself.

              I’m referring to anyone running an LLM for commercial purposes.

              Y’know, 80% of Nvidia’s business?

              • brucethemoose@lemmy.world · 18 hours ago

                I’ve kinda lost this thread, but what does that have to do with consumer GPU market share? The servers are a totally separate category.

                I guess my original point was agreement: the 5000 series is not as great for ‘AI’ as everyone makes it out to be, to the point where folks who can’t drop $10K for a GPU are picking up older cards instead. But if you look at download stats for these models, there is real interest in running stuff locally instead of on ChatGPT, just like people are interested in internet-free games, or Lemmy instead of Reddit.

                • MystikIncarnate@lemmy.ca · 12 hours ago

                  The original post is about Nvidia’s domination of discrete GPUs, not consumer GPUs.

                  So I’m not limiting myself to people running an LLM on their personal desktop.

                  That’s what I was trying to get across.

                  And it’s right on point for the original material.

                  • brucethemoose@lemmy.world · 12 hours ago

                    I’m not sure the bulk of datacenter cards count as ‘discrete GPUs’ anymore, and they aren’t counted in that survey. They’re generally sold socketed into 8P servers with crazy interconnects, hyper-specialized for what they do. Nvidia does sell some repurposed gaming silicon as a ‘low-end’ PCIe server card, but those don’t get a ton of use compared to the big silicon sales.

          • brucethemoose@lemmy.world · 2 days ago

            Yeah. What does that have to do with home setups? No one is putting an H200 or L40 in their homelab.

              • brucethemoose@lemmy.world · 2 days ago

                It mentions desktop GPUs, which are not part of this market share survey.

                Basically, I don’t see what the server market has to do with desktop dGPU market share. Why did you bring that up?

    • fuckwit_mcbumcrumble@lemmy.dbzer0.com · 3 days ago

      Nvidia is the only real option for AI work. Before Trump lifted the really restrictive ban on GPU exports to China, buyers there had to smuggle in GPUs from the US, and if you’re Joe Schmo the only GPUs you can really buy are gaming ones. That’s why the 5090 has been selling so well despite costing $2K and not being all that much better than the 4090 in gaming.

      Also, AMD has no high-end GPUs, and Intel barely has a mid-range GPU.

      • brucethemoose@lemmy.world · 2 days ago

        To be fair, AMD is trying as hard as they can to not be appealing there. They inexplicably participate in the VRAM cartel when… they have no incentive to.

          • brucethemoose@lemmy.world · 18 hours ago

            Basically, consumer VRAM is dirt cheap, not too far from DDR5 in $/gigabyte. And high-VRAM cards (especially 48 GB+) are in high demand.

            But Nvidia charges through the nose for the privilege of adding more VRAM to cards. See this, which is almost the same silicon as the 5090: https://www.amazon.com/Blackwell-Professional-Workstation-Simulation-Engineering/dp/B0F7Y644FQ

            The bill of materials is really only like $100-$200 more, at most, but Nvidia can get away with this because everyone is clamoring for their top-end cards.


            AMD, meanwhile, is kind of a laughing stock in the prosumer GPU space. No one’s buying them for CAD. No one’s buying them for compute, for sure… And yet they do the same thing as Nvidia: https://www.amazon.com/AMD-Professional-Workstation-Rendering-DisplaPortTM/dp/B0C5DK4R3G/

            In other words, with a phone call to their OEMs like Asus and such, Lisa Su could lift the VRAM restrictions from their cards and say “you’re allowed to sell as much VRAM on a 7900 or 9000 series as you can make fit.” They could pull the rug out from under Nvidia and charge a $100-$200 markup instead of a $3000-$7000 one.

            …Yet they don’t.

            It makes no sense. They’re maintaining an anticompetitive VRAM ‘cartel’ with Nvidia instead of trying to compete.

            Intel has more of an excuse here, as they literally don’t manufacture a GPU that can take more than 24 GB of VRAM, but AMD has no excuse I can think of.
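
            Putting the ballpark figures above side by side (these are the comment’s claimed numbers, not verified prices):

            ```python
            # Claimed parts cost of the extra VRAM vs. the claimed markup for it.
            bom_extra_vram_usd = 150   # midpoint of the ~$100-$200 estimate above
            vram_markup_usd = 5_000    # midpoint of the ~$3,000-$7,000 range above
            print(f"markup is ~{vram_markup_usd / bom_extra_vram_usd:.0f}x the parts cost")
            ```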

        • Diplomjodler@lemmy.world · 2 days ago

          My theory is that they’re just scared to annoy Nvidia too much. If they priced their GPUs so as to really increase their market share, Nvidia might retaliate. And Nvidia definitely has the deeper pockets. AMD has no chance to win a price war.

            • Holytimes@sh.itjust.works · 2 days ago

              It’s fear of failure, not success, because success isn’t an option.

              Because if they start to “succeed”, they actually fail, since they will be crushed by Nvidia.

              Their options are to either hold the status quo or lose more because they angered the green hulk in the room.

              • ganryuu@lemmy.ca · 2 days ago

                Wait wait wait… If I push your theory a bit, it then means that Nvidia could crush AMD at any time, becoming a full-fledged monopoly (and raking in much more profit), but they are… deciding not to? Out of the goodness of their hearts, maybe?

        • fuckwit_mcbumcrumble@lemmy.dbzer0.com · 2 days ago

          That article is a year old and is missing the latest generation of cards. Neither AMD nor Nvidia produces those GPUs anymore. AMD’s best GPU from the 9000 series competes with Nvidia’s 5070/5070 Ti. The 5090 and 5080 are unmatched.

          • BCsven@lemmy.ca · 2 days ago

            Kind of my point: those cards were high end and are still usable by 95% of people. Everyone is chasing 1% gains for twice the price. I have a new RTX card via work equipment for rendering, and I play games on the side, but that RTX doesn’t really make the gameplay that much better. It looks great with the shine on metal or water reflections, but when you’re totally immersed in gameplay, that stuff is wasted.

            • brucethemoose@lemmy.world · 2 days ago

              Honestly, stuff like Unreal’s Lumen or Crytek’s SVOGI has made RTX obsolete. It looks freaking incredible, runs fast, and you can put the rendering budget toward literally anything else; who in their right mind would develop for RTX over that?

      • Marthirial@lemmy.world · 1 day ago

        At the end of the day, I think it’s this simple: CUDA works and developers use it, so users get a tangible benefit.

        If AMD comes up with a better version of CUDA, you have the disruption needed to compete.

        • MangoPenguin@lemmy.blahaj.zone · 20 hours ago

          I’m not sure that would even help that much, since tools out there already support CUDA, and even if AMD had a better version it would still require everyone to update apps to support it.
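
          The lock-in is visible in how existing code is written; a sketch of the pattern, with PyTorch as one common example:

          ```python
          import torch

          # The backend check that countless existing tools hard-code today:
          device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

          # Notably, AMD's ROCm builds of PyTorch answer to this same torch.cuda
          # API, precisely because updating every downstream call site is impractical.
          x = torch.ones(3, device=device)
          print(x.device)
          ```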

    • lemonySplit@lemmy.ca · 2 days ago

      Meanwhile, Framework’s new AMD offering has Nvidia slop in it. Just why. We want AMD. Give us AMD.

      • notthebees@reddthat.com · 2 days ago

        They did. There just aren’t any new AMD mobile GPUs. I think they only have about 100 W of TDP or max cooling to work with, and the 7700S is currently the fastest AMD mobile GPU.

        If AMD makes a new mobile GPU, Framework will probably make it into a module.

    • warm@kbin.earth · 3 days ago

      They need DLSS, otherwise the triple-A games they love so much won’t reach 30 fps!!

    • SoftestSapphic@lemmy.world · 2 days ago

      I will never get another AMD card after my first one just sucked ass and didn’t ever work right.

      I wanted to try an Intel card, but I wasn’t even sure if I could find Linux drivers for it, because they weren’t on the site for download and I couldn’t find anything specifying whether their newer cards even worked on Linux.

      So yeah, Nvidia is the only viable company for me to buy a graphics card from.

      • ganryuu@lemmy.ca · 2 days ago

        That kind of comment always feels a bit weird to me; are you basing AMD’s worth as a GPU manufacturer on that one bad experience? It could just as well have happened on an Nvidia chip; would you be pro-AMD in that case?

        On the Intel part, I’m not up to date but historically Intel has been very good about developing drivers for Linux, and most of the time they are actually included in the kernel (hence no download necessary).

        • SoftestSapphic@lemmy.world · 2 days ago

          That kind of comment always feels a bit weird to me; are you basing AMD’s worth as a GPU manufacturer on that one bad experience?

          Absolutely. If a company I’m trying for the first time gives me a bad experience, I will not go back. That was me giving them a chance, and AMD fucked up that chance, and I couldn’t even get a refund for like a $200 card. Choosing to try a different option wasted my time and money, and it pushed back getting my rig working by half a year until I could afford a working card again, which really pissed me off.

          I didn’t know that about Intel cards; I’ll have to try one for my next upgrade if I can confirm on their site that they’re supported.

        • njm1314@lemmy.world · 1 day ago

          What else would a consumer base things on except their own experience? It’s not like it’s a rare story, either.

          • ganryuu@lemmy.ca · 21 hours ago

            I don’t know, real-world data maybe? Your one, or two, or even ten experiences are statistically insignificant. And of course it’s not a rare story: the people who talk about a product online are usually the ones with a bad experience, complaining about it, which introduces a bias you have to account for. So you go for things like failure rates, which you can find online.

            By the way, it’s almost never actually AMD’s or Nvidia’s fault, but the fault of the board partner that actually manufactured the card.

            Edit: Not that I care about Internet points, but downvoting without a rebuttal is… not very convincing.
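
            To make the statistics point concrete, a tiny sketch (the 2% defect rate is a made-up example, not a real figure for any vendor):

            ```python
            # Chance a buyer sees at least one bad card in n purchases, given a
            # hypothetical 2% defect rate -- one dud tells you almost nothing.
            p_fail = 0.02
            for n in (1, 2, 10):
                print(f"{n} purchases: {1 - (1 - p_fail) ** n:.1%} chance of a dud")
            ```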

            • njm1314@lemmy.world · 21 hours ago

              A person’s actual experience with a product isn’t real-world data? Fanboys for huge companies are so weird.

              • ganryuu@lemmy.ca · 19 hours ago

                Please read my entire comment; I also said your experience as one person is statistically insignificant. As in, you cannot rely on one bad experience considering the volume of GPUs sold. Anybody can be unlucky with a purchase and get a defective product, no matter how good the manufacturer is.

                Also, please point out where I did any fanboyism. I did not take any side in my comments. Bad faith arguments are so weird.

                • njm1314@lemmy.world · 18 hours ago

                  Sure buddy, we’re all idiots for not liking the product you simp for. Got it.

                  • ganryuu@lemmy.ca · 18 hours ago

                    Nice. You did not answer anything, and did not point out where I’m simping or being a fanboy. I’m not pro-Nvidia, nor pro-AMD, nor pro-anything (if anything, I’m pretty anti-consumerism actually, not that you care).

                    You’re being extremely transparent in your bad faith.

    • darkkite@lemmy.ml · 2 days ago

      I do local AI stuff and I get more support with Nvidia CUDA, and you usually get exclusive gaming features first on Nvidia, like DLSS, RTX, and Voice.

      I wish they shipped with more VRAM though.