• kibiz0r@midwest.social · 2 hours ago
    1. Nvidia abandons x86 desktop gamers
    2. The only hardware gamers own is ARM handhelds
    3. Some gamers stream x86 games, but devs start selling ARM builds since the x86 market is shrinking
    4. AI bubble pops
    5. Nvidia tries to regain x86 desktop gamers
    6. Gamers are almost entirely on ARM
    7. Nvidia pulls an IBM and vanishes quietly into enterprise services and not much else
    • Eldritch@piefed.world · 23 minutes ago

      Nvidia has drivers for ARM. They’re not in as good shape as the x86 ones, but I don’t think that’s a big roadblock.

  • etherphon@lemmy.world · 2 hours ago

    Games and gaming have fully become like Hollywood and Silicon Valley, and I expect zero good things from them at this point. As with movies and music, most of the good stuff will now come from individuals and smaller enterprises. The fact is that today’s GPUs already have enough power to do extraordinary things; hardware moves so fast that no one squeezes performance out of it the way they used to have to. And not every game needs photorealistic, ray-traced graphics, so these GPUs will be fine for many gamers as long as they remain supported through drivers.

  • DaddleDew@lemmy.world · 4 hours ago

    With China working hard to catch up in chip production, it’s only a matter of time before we start seeing attractively priced, Chinese-made GPUs on the market. No idea how long that will take, though.

    • 9488fcea02a9@sh.itjust.works · 4 hours ago

      What makes you think Chinese firms won’t also jump on the AI bandwagon?

      Someone with an actual CS/engineering background, feel free to correct me, but I feel like the only way out of this for gamers is if someone finds a better type of chip for AI work. GPUs just happened to be the best tool for the job when the world went crazy; they were never designed specifically for these workloads.

      If someone like Tenstorrent can design a RISC-V chip for these LLM workloads, it might take some demand off gaming GPUs.

      • otacon239@lemmy.world · 3 hours ago

        You’ve got a good point. I wouldn’t be surprised if Nvidia were working on a dedicated AI platform to cover exactly this issue. Then again, I would be equally unsurprised if they just didn’t care and didn’t mind gutting the home gaming market for short-term profit.

      • Jhex@lemmy.world · 3 hours ago

        What makes you think Chinese firms won’t also jump on the AI bandwagon?

        The bubble won’t last that long.

        • JensSpahnpasta@feddit.org · 1 hour ago

          The bubble might burst, but there are real use cases for AI, so you will still see it in use even after the current Ponzi scheme has collapsed. Voice and handwriting recognition and speech generation are genuinely good now, and that won’t go away.

        • chaogomu@lemmy.world · 1 hour ago

          The only thing that will burst the bubble is electricity.

          The dot-com bubble burst and left behind dark fiber: massive Internet backbones were easy to build, but the last mile to people’s homes was not.

          The current electrical grid cannot support the number of data centers being built, let alone the ones planned on top of that. Dark data centers will be the new dark fiber.

          There’s more complexity to it all, but for this particular bubble it really does boil down to power.

          • Jhex@lemmy.world · 22 minutes ago

            Or lack of use? The current trend is fueled by hype that AI can do everything and will replace 50% of the workforce, which would be another nightmare scenario… However, current AI may be an OK tool for some jobs and not much more; the world does not need 200 GW of AI datacentres to produce memes.

    • VitoRobles@lemmy.today · 2 hours ago

      Seriously. Why would I care that a billion-dollar corporation that exploited the market to maximize its revenue is leaving for a fad market?

      “Bye bitch.”

      • uncouple9831@lemmy.zip · 1 hour ago

        I don’t understand the hate for Nvidia. They raised their prices, and people kept paying them. AMD has always been there, just not quite as good. The people willing to pay whatever it takes for the brand they want are the problem. Oh nooo, I have to render at 2K@60 instead of 4K@120, how will my poor eyes survive?!?

        Just don’t buy their stuff. It’s not worth it and hasn’t been for most of a decade.

        • Eldritch@piefed.world · 27 minutes ago

          That didn’t happen in a vacuum. A lot of us do more than game, and there legitimately wasn’t an alternative until much more recently.

          For instance, for over a decade, if you were rendering out a hardware-accelerated video through Premiere, it was likely with an Nvidia card. Ray tracing? Nvidia has been king there since long before the 2000 series. It’s changing, slowly, thank goodness. I’m more than happy to be able to ditch Nvidia myself.

  • mel ♀@jlai.lu · 5 hours ago

    I’d rather pay for Chinese GPUs than for cloud gaming. By focusing so narrowly, the article almost pretends that nothing else exists, especially in China, where Moore Threads GPUs are starting to reach reasonable gaming performance. I don’t think Europe can produce something comparable as long as it stays neoliberal, but some weird stuff could happen with RISC-V.

    • brucethemoose@lemmy.world · 4 hours ago

      Fact is, Nvidia has the vast majority of market share: https://www.techpowerup.com/337775/nvidia-grabs-market-share-amd-loses-ground-and-intel-disappears-in-latest-dgpu-update

      It’s inexplicable to me. Why are folks ignoring Battlemage and AMD 9000 series when they’re so good?

      Alas, for whatever reason they aren’t even picking existing alternatives to Nvidia, so Nvidia suddenly becoming unavailable would be a huge event.

      • Eldritch@piefed.world · 2 hours ago

        For accelerated rendering and the like, CUDA is the standard, and because of that, so is Nvidia. It’s the same in a lot of other niche areas. Accelerated encoding? Nvidia, and CUDA again. Yes, AMD and Intel can generally do all of that these days as well, but that’s a much more recent development. If all you want to do is game, sure, it’s not a big issue.

        But if you want to do anything more than gaming, say video editing and rendering timelines, Nvidia has been the standard, with a lot of inertia behind it. To give an example of how badly AMD missed the boat: I’ve been using accelerated rendering on my GT 750 in Blender for a decade now. That card came out in early 2014; the first AMD card capable of the same came out one month before the end of 2020. Nearly a seven-year difference! I’m looking at a recent Intel Arc/Battlemage card or a 6xxx-series AMD ATM, because I run BSD/Linux and Nvidia has traditionally been a necessary PITA. But with both of them now supporting my workflows, Nvidia is much less necessary. Others will eventually leave too, as long as AMD and Intel don’t fuck it up for themselves.
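
        To make “capable of that” concrete: a minimal sketch, assuming a reasonably recent Blender, that lists which Cycles GPU backends (CUDA/OptiX for Nvidia, HIP for AMD, oneAPI for Intel Arc/Battlemage) the installed build and hardware actually expose. The script name and the defensive fallback are illustrative additions, not anything quoted from the thread.

        ```python
        # Run inside Blender, e.g.: blender --background --python list_cycles_devices.py
        import bpy

        cycles_prefs = bpy.context.preferences.addons["cycles"].preferences

        # Newer Blender versions expose refresh_devices(); older ones use get_devices().
        if hasattr(cycles_prefs, "refresh_devices"):
            cycles_prefs.refresh_devices()
        else:
            cycles_prefs.get_devices()

        # Each entry has a backend type (CPU, CUDA, OPTIX, HIP, ONEAPI, METAL) and a name.
        for dev in cycles_prefs.devices:
            print(f"{dev.type:7s} {dev.name}")
        ```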

        • brucethemoose@lemmy.world · 1 hour ago

          Yeah, I mean, you’re preaching to the choir there. I picked up a used 3090 because ROCm on the 7900 was in such a poor state.

          That being said, much of what you describe is just software obstinacy. AMD, for example, has had hardware encoding since early 2012 with the 7970, and Intel Quick Sync has long been standard on laptops. It’s just that a few stupid proprietary apps never bothered to support them.

          CUDA is indeed extremely entrenched in some areas, like anything involving PyTorch or Blender’s render engines. But there’s no reason (say) Plex shouldn’t support AMD, or that older editing programs, which use OpenGL anyway, shouldn’t either.
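
          To illustrate the “software obstinacy” point: a minimal sketch, assuming PyTorch’s ROCm build (which reuses the torch.cuda namespace), showing that application-level code is already vendor-neutral; the entrenchment sits in the stacks built on top, not in what the hardware can do. The pick_device helper and the placeholder model are illustrative, not from any comment above.

          ```python
          import torch

          def pick_device() -> torch.device:
              if torch.cuda.is_available():          # True on both CUDA and ROCm builds
                  backend = "ROCm" if torch.version.hip else "CUDA"
                  print(f"GPU via {backend}: {torch.cuda.get_device_name(0)}")
                  return torch.device("cuda")        # "cuda" also addresses AMD GPUs on ROCm
              if torch.backends.mps.is_available():  # Apple Silicon fallback
                  return torch.device("mps")
              return torch.device("cpu")

          device = pick_device()
          model = torch.nn.Linear(16, 4).to(device)  # placeholder model
          x = torch.randn(8, 16, device=device)
          print(model(x).shape)                      # torch.Size([8, 4])
          ```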

          • Eldritch@piefed.world · 46 minutes ago

            Yes, the software will get there long before many people’s hearts and minds do; as you said, it already has in many cases. The inertia Nvidia gained by being early is why they’re so dominant now, but I think their focus on crypto and now data-center AI is set to hurt them long term. Only time will tell, and they’re technically swimming in it ATM, but I’m getting out now.

            • brucethemoose@lemmy.world · 5 minutes ago

              CUDA is actually pretty cool, especially in the early days when there was nothing like it. And Intel’s and AMD’s attempts at alternatives have been as mixed as their corporate dysfunction.

              And Nvidia has long had a focus on other spaces, like VR, AR, dataset generation, robotics, “virtual worlds” and such. If every single LLM thing disappeared overnight in a puff of smoke, they’d be fine; a lot of their efforts would transition to those other spaces.

              Not that I’m an apologist for them being total jerks, but I don’t want to act like CUDA isn’t useful, either.

      • alk@lemmy.blahaj.zone · 4 hours ago

        It’s inexplicable to me. Why are folks ignoring Battlemage and AMD 9000 series when they’re so good?

        Because the AMD 7900 XTX, a 2022 GPU, has better raw performance than the 9000 series, and I’m pissed off they didn’t try to one-up themselves in the high-end market. I’m not buying a new Nvidia card, but I’m not buying a 9000-series card either, because it feels like I’m paying for a sub-par GPU compared to what they’re capable of.

        • brucethemoose@lemmy.world · 3 hours ago

          Well, same for me TBH. I’d buy a huge Battlemage card before you could blink, but the current cards aren’t really faster than an ancient 3090.

          But most people I see online want something more affordable, right in the 9000 series or Arc range, not a 384-bit GPU.

  • brucethemoose@lemmy.world · 4 hours ago

    AFAIK, the H100 and up (Nvidia’s best-selling data center GPUs) can technically game, but they’re missing so many ROPs that they’re really bad at it. There will be no repurposing of all those AI datacenters as cloud gaming farms.

  • verdi@feddit.org · 3 hours ago

    Good riddance. May the bubble burst and all that IP become available from a licensor that charges low license fees, or even none at all!

  • RipLemmDotEE@lemmy.today · 2 hours ago

    My framerates have never been better since I went full AMD. My friend with a 5080 complains about low framerates on almost every new game, while I’m at max framerate.

    Don’t let the door hit ya on the way out, Nvidia.