• RipLemmDotEE@lemmy.today · 6 minutes ago

    My framerates have never been better since I went full AMD. My friend with a 5080 complains about low framerates on almost every new game, while I’m at max framerate.

    Don’t let the door hit ya on the way out, Nvidia.

  • kibiz0r@midwest.social · 37 minutes ago
    1. Nvidia abandons x86 desktop gamers
    2. The only hardware gamers own is ARM handhelds
    3. Some gamers stream x86 games, but devs start selling ARM builds since the x86 market is shrinking
    4. AI bubble pops
    5. Nvidia tries to regain x86 desktop gamers
    6. Gamers are almost entirely on ARM
    7. Nvidia pulls an IBM and vanishes quietly into enterprise services and not much else
  • DaddleDew@lemmy.world · 2 hours ago

    With China working hard to catch up in chip production, it is only a matter of time before we start seeing attractively priced Chinese-made GPUs on the market. No idea how long it will take, though.

    • 9488fcea02a9@sh.itjust.works · 2 hours ago

      What makes you think Chinese firms won’t also jump on the AI bandwagon?

      Someone with an actual CS/engineering background, feel free to correct me, but I feel like the only way out of this for gamers is if someone finds a better type of chip for AI work. GPUs just happened to be the best thing for the job when the world went crazy; they were never designed specifically for these workloads.

      If someone like Tenstorrent can design a RISC-V chip for these LLM workloads, it might take some demand off gaming GPUs.

      • Jhex@lemmy.world · 47 minutes ago

        What makes you think Chinese firms won’t also jump on the AI bandwagon?

        the bubble won’t last that long

      • otacon239@lemmy.world · 1 hour ago

        You’ve got a good point. I wouldn’t be surprised if Nvidia were working on a dedicated platform for AI to cover this exact issue. Then again, I would be equally unsurprised if they just didn’t care and didn’t mind gutting the home gaming market for short-term profit.

  • mel ♀@jlai.lu · 3 hours ago

    I’d rather pay for Chinese GPUs than for cloud gaming. The article, by focusing so narrowly, almost acts as if nothing else exists, especially in China, where Moore Threads GPUs are starting to show reasonable gaming performance. I don’t think Europe can produce something as long as it stays neoliberal, but some weird stuff could happen with RISC-V.

    • brucethemoose@lemmy.world · 3 hours ago

      Fact is, Nvidia has the vast majority of market share: https://www.techpowerup.com/337775/nvidia-grabs-market-share-amd-loses-ground-and-intel-disappears-in-latest-dgpu-update

      It’s inexplicable to me. Why are folks ignoring Battlemage and AMD 9000 series when they’re so good?

      Alas, for whatever reason they aren’t even picking existing alternatives to Nvidia, so Nvidia suddenly becoming unavailable would be a huge event.

      • Eldritch@piefed.world · 17 minutes ago

        For accelerated rendering and the like, CUDA is the standard, and because of it, Nvidia. It’s like that in a lot of other niche areas, too. Accelerated encoding? Nvidia, and CUDA again. Yes, AMD and Intel can generally do all of that these days as well, but that’s a much more recent thing. If all you want to do is game, sure, it’s not a big issue.

        But if you want to do anything more than gaming, say video editing and rendering timelines, Nvidia has been the standard, with a lot of inertia. To give an example of how badly AMD missed the boat: I have been using accelerated rendering on my GT 750 in Blender for a decade now. That card came out in early 2014; the first AMD card capable of the same came out one month before the end of 2020. Nearly a seven-year difference! I’m looking at a recent Intel Arc/Battlemage card or a 6xxx-series AMD at the moment, because I run BSD/Linux and Nvidia has traditionally been a necessary PITA. But with both of them now supporting my workflows, Nvidia is much less necessary. Others will eventually leave too, as long as AMD and Intel don’t fuck it up for themselves.
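
        As a rough illustration of what “both of them now supporting my workflows” looks like in practice: a minimal sketch of selecting a Cycles compute backend from Blender’s Python console. This assumes a recent Blender build; which backends are actually offered (CUDA, OPTIX, HIP, ONEAPI) and what devices show up depend on your Blender version, GPU, and drivers.

            import bpy

            # The Cycles add-on preferences hold the compute backend selection
            prefs = bpy.context.preferences.addons["cycles"].preferences

            # Assumption: HIP for an AMD card; "CUDA"/"OPTIX" would be the Nvidia
            # choices and "ONEAPI" the Intel Arc one
            prefs.compute_device_type = "HIP"

            # Refresh the device list and enable every non-CPU device Blender can see
            prefs.get_devices()
            for dev in prefs.devices:
                dev.use = dev.type != "CPU"
                print(dev.name, dev.type, "enabled" if dev.use else "disabled")

            # Render the current scene on the GPU
            bpy.context.scene.cycles.device = "GPU"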

      • alk@lemmy.blahaj.zone · 2 hours ago

        It’s inexplicable to me. Why are folks ignoring Battlemage and AMD 9000 series when they’re so good?

        Because the 7900 XTX, AMD’s 2022 GPU, has better raw performance than the 9000 series, and I’m pissed off they didn’t try to one-up themselves in the high-end market. I’m not buying a new Nvidia card, but I’m not buying a 9000 series either, because it feels like I’m paying for a sub-par GPU compared to what they’re capable of.

        • brucethemoose@lemmy.world · 1 hour ago

          Well, same for me TBH. I’d buy a huge Battlemage card before you could blink, but the current cards aren’t really faster than an ancient 3090.

          But most people I see online want something more affordable, right in the 9000 series or Arc range, not a 384-bit GPU.

    • VitoRobles@lemmy.today · 21 minutes ago

      Seriously. Why would I care that a billion-dollar corporation that exploited the market to maximize its revenue is leaving for a fad market?

      “Bye bitch.”

  • brucethemoose@lemmy.world · 3 hours ago

    AFAIK, the H100 and up (Nvidia’s bestselling data center GPUs) can technically game, but they’re missing so many ROPs that they’re really bad at it. There will be no repurposing all those AI datacenters for cloud gaming farms.

  • verdi@feddit.org · 1 hour ago

    Good riddance, may the bubble burst and all that IP become available from a licensor that charges low license fees, or even for free!