2023 was the year that GPUs stood still: A new GPU generation did very little to change the speed you get for your money.

  • 𝒍𝒆𝒎𝒂𝒏𝒏@lemmy.dbzer0.com · ↑103 ↓2 · 10 months ago

    Hands up if you/someone you know purchased a Steam Deck or other computer handheld, instead of upgrading their GPU 🙋‍♂️

    To be honest I stopped following PC hardware altogether because things were so stagnant outside of Intel’s Alder Lake and its new x86 P/E cores. GPUs that would give me a noticeable performance uplift from my 1060 aren’t at appealing prices outside the US either, IMO.

    • givesomefucks@lemmy.world · ↑46 ↓1 · 10 months ago

      It’s diminishing returns.

      We need a giant leap forward to show a noticeable effect now.

      Like, if a car’s top speed was 10 mph, a 5 mph increase is fucking huge.

      But getting a supercar to top out at 255 instead of 250 just isn’t a huge deal. And you wouldn’t notice unless you were testing it.

      So even if they keep increasing performance at a steady rate, the end user is going to notice it less and less every time.
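
      To put toy numbers on that (just illustrating the math, not actual benchmarks):

      ```python
      # Toy numbers only: the same absolute gain shrinks as a share of the baseline.
      def relative_uplift(old: float, new: float) -> float:
          """Percentage improvement of `new` over `old`."""
          return (new - old) / old * 100

      print(f"10 -> 15 mph:   +{relative_uplift(10, 15):.0f}%")    # +50%, night and day
      print(f"250 -> 255 mph: +{relative_uplift(250, 255):.0f}%")  # +2%, hard to even notice
      ```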

      • ugjka@lemmy.world · ↑14 · 10 months ago

        The money is in AI chips for datacenters; I think regular consumers will increasingly be getting only the dinner leftovers.

    • Uninvited Guest@lemmy.ca · ↑15 · 10 months ago

      From 2020 I planned on building a new gaming PC. Bought an ITX case and followed hardware releases closely… And then got disillusioned with it all.

      Picked up a Steam Deck in August of 2022 and couldn’t be happier with it. The ITX case is collecting dust.

      • theangryseal@lemmy.world · ↑12 · 10 months ago

        I game exclusively on my Steam Deck these days.

        I absolutely love it. I dock it and use desktop mode as my standard PC too. It does everything I need it to do.

        • bnjmn@lemm.ee · ↑1 · 10 months ago

          Same here! I was worried I wouldn’t use it (I haven’t been gaming on PC much) but I actually game on it much more than on PC

    • daq@lemmy.sdf.org · ↑2 ↓2 · 10 months ago

      I’m surprised so many people are cross-shopping, tbh. I briefly considered a Steam Deck, but the specs are barely enough to play at 1080p, so it’s useless to me when docked, and a purely portable device with a tiny screen and gamepad carries very little value to me personally.

      I ended up getting an eGPU enclosure for my laptop and grabbing a 1080 Ti from a friend who didn’t need it anymore. I’m able to play D4 at 4K on medium settings.

      Even if I had to buy a GPU like I was originally planning, ~$800 total to play at 4K on a 43" screen with a mouse and keyboard is a completely different experience from anything the Xbox or Steam Deck offer.

  • that guy@lemmy.world · ↑70 ↓2 · 10 months ago

    You guys think I should upgrade my Voodoo 3 card? No one is joining my Quake server anymore anyway.

  • just_change_it@lemmy.world · ↑55 ↓1 · 10 months ago

    Given technological progress and efficiency improvements, I would argue that 2023 is the year the GPU ran backwards. We’ve been in a rut since 2020… and arguably since the 2018 crypto explosion.

    • Vash63@lemmy.world · ↑10 · 10 months ago

      Nah, it was running backwards far more in 2022. 2023 was a slight recovery, but still worse than 2021.

        • Throw a Foxtrot@lemmynsfw.com · ↑5 ↓1 · edited · 10 months ago

          No, it’s a datum - about how people feel

          Performance numbers are easy to find. The prices have not been great and the 4060 is held back by its reduced memory bandwidth, but it’s a performance increase nevertheless. The flagship product, the one that shows what is currently possible in terms of GPU power, did show a remarkable improvement in top performance.

          I’m more salty about AMD not supporting AI workloads on their consumer GPUs. Yes, ROCm exists and it will work on quite a few cards, but officially it’s not supported. This is a major reason why Nvidia is still the only serious player in town.
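
          For anyone curious, here’s a minimal sketch of checking whether a ROCm build of PyTorch actually sees a Radeon card (assumes the ROCm wheel of PyTorch is installed; officially unsupported consumer cards often also need the HSA_OVERRIDE_GFX_VERSION workaround):

          ```python
          # Sketch: verify a ROCm build of PyTorch can see and use an AMD GPU.
          # Assumes the ROCm (not CUDA) PyTorch wheel is installed.
          import torch

          print("ROCm/HIP build:", torch.version.hip is not None)  # None on CUDA/CPU-only builds
          print("GPU available: ", torch.cuda.is_available())      # ROCm reuses the torch.cuda API

          if torch.cuda.is_available():
              x = torch.rand(2048, 2048, device="cuda")  # "cuda" maps to the HIP device on ROCm
              print("Matmul OK:", torch.matmul(x, x).shape)
          ```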

          • fruitycoder@sh.itjust.works · ↑1 · 10 months ago

            Yeah, AMD just doesn’t seem to want to market AI on consumer hardware to devs. They have a Ryzen chip line with built-in dedicated "NPUs" now, but honestly the disconnect between AI on the GPUs and a focus on Windows, even for development, just makes it feel clunky.

            • Throw a Foxtrot@lemmynsfw.com · ↑3 · 10 months ago

              OK, I thought it was common knowledge, but maybe I should specify.

              Datum is the singular form of data; data is a collection of many individual datums. If you have ten thousand anecdotes, they do in fact become statistically significant.
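
              A rough sketch of why sample size matters (worst-case margin of error for a simple proportion; this ignores selection bias, which is the usual objection to anecdotes):

              ```python
              # Back-of-the-envelope: 95% margin of error for a proportion, worst case p = 0.5.
              import math

              def margin_of_error(n: int, p: float = 0.5) -> float:
                  return 1.96 * math.sqrt(p * (1 - p) / n)

              for n in (10, 100, 10_000):
                  print(f"n = {n:>6}: ±{margin_of_error(n) * 100:.1f} percentage points")
              # n = 10 -> ±31.0, n = 100 -> ±9.8, n = 10000 -> ±1.0
              ```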

  • trag468@lemmy.world · ↑28 · 10 months ago

    Still rocking a 1080. I don’t see a big enough reason to upgrade yet; I mostly play PC games on my Steam Deck anyway. I thought Starfield was going to give me a reason, Cyberpunk before that. I’m finally playing Cyberpunk, but the advanced haptics on PS5 sold me on going that route over my PC.

    • Kit@lemmy.blahaj.zone · ↑7 · 10 months ago

      I just “upgraded” from a GTX 1080 to an RTX 4060 Ti 16GB, but only because I was building a PC for my boyfriend and gave him the 1080. I’m really not seeing a noticeable difference in frame rate at 1440p.

    • ATDA@lemmy.world · ↑5 · 10 months ago

      Yeah, I keep waiting for a good deal to retire my 1080 Ti.

      Guess I could go for a 3060 or something, but the 40 series will probably leave my old CPU behind.

  • HeyJoe@lemmy.world · ↑21 ↓1 · 10 months ago

    As someone who upgraded from a 2016 GPU to a 2023 one, I was completely fine with this. Prices finally came down and I got the best card 2023 offered me, which may not have been impressive for this generation but was incredible compared to what I came from.

    • Kit@lemmy.blahaj.zone · ↑8 · 10 months ago

      I’m so glad that Intel has stepped into the GPU space, even if their cards are weaker. More competition will hopefully light a fire under Nvidia to get their shit together.

    • CalcProgrammer1@lemmy.ml · ↑9 ↓1 · 10 months ago

      I’ve been very happy with my Arc A770, it works great on Linux and performs well for what I paid for it.

  • aluminium@lemmy.world · ↑15 · edited · 10 months ago

    I finally upgraded my GTX 970 to a used RTX 3080 for 300€. The difference, at least for me, for the same 300€ was insane.

  • DrPop@lemmy.ml · ↑15 ↓1 · 10 months ago

    I just don’t see the point in upgrading every new release anyway, or even buying the most expensive one. I’ve had my Gigabyte RX 570 for several years and I can play Baldur’s Gate 3 at full settings with no issues. Maybe I haven’t tasted 120 fps, but I’m just happy I can play modern games. When it comes time to get a new graphics card, which may be soon since I’m planning to build my wife’s PC, maybe then I’ll see what’s going on with the higher-end ones. Maybe I’m just a broke ass though.

    • cyberpunk007@lemmy.world · ↑2 · edited · 10 months ago

      Yeah, the problem I landed in was not anticipating how hard it would be to push my new monitor: ultrawide 2.5K resolution at 144 Hz. I can’t run Cyberpunk at full resolution above 60 fps, and that’s with DLSS enabled and not all settings at max.

      (RTX 2070 Super)
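
      For context on why that monitor is so hard to drive, a quick pixel-throughput estimate (assuming the 2.5K ultrawide is 3440x1440):

      ```python
      # Rough pixel-throughput comparison; assumes the ultrawide is 3440x1440.
      def pixels_per_second(width: int, height: int, hz: int) -> int:
          return width * height * hz

      baseline = pixels_per_second(1920, 1080, 60)    # plain 1080p at 60 Hz
      ultrawide = pixels_per_second(3440, 1440, 144)  # the monitor described above

      print(f"{ultrawide / baseline:.1f}x the pixels per second of 1080p60")  # ~5.7x
      ```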

  • Paddzr@lemmy.world · ↑13 ↓2 · 10 months ago

    I had to buy a 3070 Ti at a scalped price. Ended up paying £700 for it. I hate myself for it, but the prices didn’t shift for months after and my GTX 1080 had kicked the bucket. No way in hell am I buying anything this gen. My wife’s 1080 is holding on for now; maybe we’ll get a 5080 if it’s not a rip-off.

  • Buffalox@lemmy.world · ↑9 · edited · 10 months ago

    So how about the 2½ years from 2016 to 2018 between the Nvidia GTX 1080 Ti and the RTX 2080?
    I think the headline should say A year, not THE year.

    • weeeeum@lemmy.world · ↑8 · 10 months ago

      To be honest, I think it’s just AI developers gobbling them all up, because Nvidia’s dedicated workload and professional GPUs are always sold out. Plus, spending $1,400 on games is ridiculous, and that’s coming from somebody with a Ryzen 7800X3D and a 7900 XTX. I regret it so much; such a waste of money.

      • nexusband@lemmy.world · ↑1 · edited · 10 months ago

        Having a 7900 XTX and a 5800X… I don’t really get the waste-of-money part. I can throw everything at it and it runs exceptionally well at 5120x1440 resolution. Most, if not all, of it runs well inside the FreeSync 2 range… I couldn’t be any happier, and since I’m getting old now, I’d compare it to the Athlon 64 X2 days with a Radeon X850 XT… between then and now, I never had a system that did so well with the games of its time.

        Edit: Oh you mean spending 1400 on games…well, yeah, games are ridiculously priced…considering you don’t really own a copy either…

  • AutoTL;DR@lemmings.world (bot) · ↑3 · 10 months ago

    This is the best summary I could come up with:


    The performance gains were small, and a drop from 12GB to 8GB of RAM isn’t the direction we prefer to see things move, but it was still a slightly faster and more efficient card at around the same price.

    In all, 2023 wasn’t the worst time to buy a $300 GPU; that dubious honor belongs to the depths of 2021, when you’d be lucky to snag a GTX 1650 for that price.

    But these numbers were only possible in games that supported these GPUs’ newest software gimmick, DLSS Frame Generation (FG).

    The technology is impressive when it works, and it’s been successful enough to spawn hardware-agnostic imitators like the AMD-backed FSR 3 and an alternate implementation from Intel that’s still in early stages.

    And DLSS FG also adds a bit of latency, though this can be offset with latency-reducing technologies like Nvidia Reflex.

    But to put it front-and-center in comparisons with previous-generation graphics cards is, at best, painting an overly rosy picture of what upgraders can actually expect.


    The original article contains 787 words, the summary contains 168 words. Saved 79%. I’m a bot and I’m open source!

  • AlpacaChariot@lemmy.world · ↑3 · 10 months ago

    What’s everyone’s recommendation for a cheap AMD GPU to use with Linux? I was recently looking at a Radeon RX 580; I know there are much better cards out there, but the prices are about double (£350-400 instead of £180). I’d mostly be using it to play games like the remastered Rome: Total War.

    • TheGrandNagus@lemmy.world · ↑4 · 10 months ago

      6600 XTs seem to be going for around £200, often even £180 (used, on eBay).

      If you’d prefer new, you can get a 6650 XT for £240. A 6650 XT will be about 6% faster than a 6600 XT.

      Either is roughly double the performance of a 580, uses less power, will be supported longer, etc.
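
      Rough price-to-performance using the figures above (the performance ratios are this thread’s estimates, not benchmark data):

      ```python
      # Sketch: relative performance per £100, normalised to RX 580 = 1.0.
      # Prices and performance ratios are the rough figures quoted in this thread.
      cards = {
          "RX 580 (~£180)":           {"price": 180, "perf": 1.00},
          "RX 6600 XT (used, ~£200)": {"price": 200, "perf": 2.00},  # ~2x a 580
          "RX 6650 XT (new, £240)":   {"price": 240, "perf": 2.12},  # ~6% over a 6600 XT
      }

      for name, card in cards.items():
          print(f"{name}: {card['perf'] / card['price'] * 100:.2f} perf per £100")
      ```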

    • bazsy@lemmy.world · ↑4 · 10 months ago

      There are some used options, e.g. 5700 XTs are really cheap because many of them were mining cards. For new cards there aren’t many options; the RX 6600 has relatively good value, but it’s only worth it if efficiency or features like hardware video codecs are important to you.

      • AlpacaChariot@lemmy.world · ↑2 · 10 months ago

        Is there any issue with buying a card that was previously used for mining?

        When you say RX 6600, do you mean that card specifically or the range including the 6600 XT etc.? I don’t have a good handle on the real-world differences between the variants.

        • Hitchie_Rawtin@lemmy.world · ↑4 · 10 months ago

          Is there any issue with buying a card that was previously used for mining?

          If it was used by a home user who didn’t know what they were doing, they might have run it hotter for much longer than a typical gamer would, so the thermal paste might need a redo.

          If it was used by a miner doing it even quasi-professionally or as a side gig, I’d much prefer it over a second-hand card from a typical gamer: most miners have kept the voltage/temps low and taken care of the card far better than a gamer, who might be power cycling regularly and is definitely thermal cycling even more regularly.

        • bazsy@lemmy.world · ↑2 · 10 months ago

          No, there isn’t any more risk buying a mining card than any other used card. In both cases you should use a platform/marketplace with buyer protection options. Maybe one additional step is checking the VBIOS when testing.

          The non-XT is the best value of the 6600 family, but depending on local pricing the 6600 XT, 6650 XT and even the 7600 could make sense. Just keep in mind that these are all in the same performance class; benchmark charts comparing the mentioned GPUs bear that out.

    • tabular@lemmy.world · ↑2 · 10 months ago

      Been waiting for a good deal to replace the RX 480 in my sister’s rig. I think they announced that RX 400/500/Vega GPUs will only get security driver updates now, and only for a while; I assume that applies to Linux too. An RX 580 will play many games at 1080p 60 fps, but not the modern demanding ones (maybe not even at low settings).

  • twinnie@feddit.uk · ↑1 · 10 months ago

    I just upgraded from a GTX 970 to a 2060 Super I bought on AliExpress. Bargain.