Well I am shocked, SHOCKED I say! Well, not that shocked.

  • Deflated0ne@lemmy.world · 4 points · 29 minutes ago

    Ah capitalism…

    Endless infinite growth forever on a fragile and very much finite planet where wages are suppressed and most money is intentionally funneled into the coffers of a small handful of people who are already so wealthy that their descendants 5 generations down the line will still be some of the richest people on the planet.

  • gravitas_deficiency@sh.itjust.works · 1 point · 4 minutes ago

    Nvidia doesn’t really care about the high-end gamer demographic nearly as much as they used to, because it’s no longer their bread and butter. Nvidia’s cash cow at this point is supplying hardware for ML data centers. It’s an order of magnitude more lucrative than serving the consumer + enthusiast market.

    So my next card is probably gonna be an RX 9070XT.

  • Mwa@thelemmy.club · 1 point · 4 minutes ago

    I’m just using my GTX 1650 (4 GB GDDR6 VRAM) and it works fine for most of what I do. On Linux I can use the FSR hack to squeeze frame rates out of games that perform poorly (roughly the launch-option setup sketched below), and it runs SCP:SL and TF2 fine. For SCP:SL I’m using the FSR hack to get more frame rate until Nvidia fixes VKD3D.
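
    A minimal sketch of what that setup often looks like, assuming Proton-GE or a similarly FSR-patched Wine build (the exact variable names depend on the build): run the game fullscreen at a lower-than-native resolution, then add Steam launch options along the lines of

        WINE_FULLSCREEN_FSR=1 WINE_FULLSCREEN_FSR_STRENGTH=2 %command%

    so the frame gets upscaled back to native resolution; the strength value here is just an illustrative sharpening setting, not a recommendation.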

  • nikki@lemmy.blahaj.zone · 1 point · 21 minutes ago

    the most i use my gpu for at this point is minecraft shaders; i don’t plan on upgrading for 10+ years

  • Evotech@lemmy.world · 1 point · 24 minutes ago

    When a new GPU was 500-900 USD it was fine.

    But yeah, the RTX 2070 keeps chugging on.

  • RadioFreeArabia@lemmy.world · 4 points · 26 minutes ago

    I just looked up the price and I was like, “Yikes!” You can get a PS5 Pro + optional Blu-ray drive, a Steam Deck OLED, and a Nintendo Switch 2, and still have plenty of money left to spend on games.

  • finitebanjo@lemmy.world · 2 points · 1 hour ago

    I’m ngl, finances had no impact on my decision to stay at a 3080. Performance and support did. Everything I want to play runs at 60 to 180 fps with my current loadout. I’m also afraid that once Windows 10 LTSC dies I won’t be able to use a high-end GPU with Linux anyway.

    • MisterCD@lemmy.world · 3 points · 47 minutes ago

      You can always side-grade to AMD. I was using a 3070 and ditched Windows for Kubuntu and while it was very usable, I would get the slightest input lag and had to make sure the compositor (desktop effects) was turned off when playing a game.

      After some research I decided to side-grade to the 6800 and it’s a night-and-day difference. Buttery smooth gaming. It performs better with the compositor on than Nvidia did with it off. I know the 6800 isn’t high end, but it’s no slouch either. AMD is king on Linux.

  • MystikIncarnate@lemmy.ca · 1 point · 51 minutes ago

    I was gifted a 2080 Ti about a year or so ago and I have no intention of upgrading anytime soon. The former owner of my card is a friend who had it in his primary gaming rig back when SLI wasn’t dead, so he had two of them.

    When he built a new main rig with a single 4090 a few years back, he gifted me one and left the other in his old system, which he now uses as a spare/guest computer for impromptu LANs. It’s still a decent system, so I don’t blame him.

    In any case, that upgraded my primary computer from a 1060 3G… So it was a welcome change to have sufficient video memory again.

    The cards keep getting more and more power hungry and I don’t see any benefit in upgrading… Not that I can afford it… I haven’t been in school for a long time, and lately, I barely have time to enjoy YouTube videos, nevermind a full assed game. I literally have to walk away from a game for so long between sessions that I forget the controls. So either I can beat the game in one sitting, or the controls are similar enough to the defaults I’m used to (left click to fire, right click to ADS, WASD for movement, ctrl or C for crouch, space to jump, E to interact, F for flashlight, etc etc…); that way I don’t really need to relearn anything.

    This is a big reason why I haven’t finished some titles that I really wanted to, like TLoU or Doom Eternal… too many buttons to remember. It’s especially bad with Doom, since if you don’t remember how and when to use your specials, you’ll run out of life, armor, ammo, etc. pretty fast. Remembering which special gives what and how to trigger it… Uhhh… is it this button? Gets slaughtered by an imp. Okay, not that button. Reload, let’s try this… Killed by the same imp. Not that either… Hmmm. Goes and looks at the key mapping. Ohhhhhh. Okay. Reload. I got it this time… Dies anyway due to other reasons.

    Whelp. Quit maybe later.

  • Demognomicon@lemmy.world · 6 points · 3 hours ago

    I have a 4090. I don’t see any reason to pay $4K+ for fake frames and a few percent better performance. Maybe next gen, post-Trump, and/or if prices become reasonable and cables stop melting.

    • Critical_Thinker@lemm.ee · 3 points · 2 hours ago

      I don’t think the 5090 has averaged $4k in months; $4k was basically March. $3k is pretty common now as a listed scalp price, and completed sales on fleabay commonly seem to be $2600-2800 now.

      The problem is that $2k was too much to begin with. It should be cheaper, but they are selling ML cards at such a markup, with truly endless demand, that there’s zero reason to put any focus at all on the gaming segment beyond a token offering that raises their margin. So business-wise they are doing great, I guess?

      As a 9070 XT and 6800 XT owner, it feels like AMD is practically done with the GPU market. It just sucks for everyone that the GPU monopoly is here, presumably to stay. It feels like backroom deals creating a noncompetitive landscape must be prevalent, and Nvidia’s artificial stranglehold on code compatibility makes the hardware itself almost irrelevant.

      • brucethemoose@lemmy.world · 2 points · 1 hour ago

        One issue is that everyone is supply-constrained by TSMC. Even Arc Battlemage is out of stock at MSRP.

        I bet Intel is kicking themselves for using TSMC. It kinda made sense when they decided years ago, but holy heck, they’d be swimming in market share if they had used their own fabs instead (and kept the bigger die).

        I feel like another issue is… marketing?

        Like, many buyers just impulse buy, or go with whatever some shill recommended in a feed. It doesn’t matter how competitive anything is anymore.

      • finitebanjo@lemmy.world · 1 point · 1 hour ago

        Technically Intel is also releasing some cheapo GPUs of similar capability to Nvidia’s, but they all have the same manufacturers anyway.

    • boonhet@lemm.ee · 1 point · 1 hour ago

      fake frames

      And that’s my main problem with what the industry has become. Nvidia always had sizable generation-to-generation jumps in raw performance. They STILL get better raw performance, but now it’s nowhere near impressive enough, so they have to put their fake-frame technologies into their graphs. Don’t get me wrong, they’ve always had questionable marketing tactics, but now it’s getting even worse.

      No idea when I’m replacing my 3060 Ti, but it won’t be Nvidia.

  • arc99@lemmy.world · 7 points · 10 hours ago

    Not surprised. Many of these high-end GPUs are bought not for gaming but for bitcoin mining, and demand has driven prices beyond MSRP in some cases. They’re stupidly power hungry and overpriced.

    My GPU, an RTX 2060, is getting a little long in the tooth and I’ll hand it off to one of the kids for their PC, but I need to find something that is a tangible performance improvement without costing eleventy stupid dollars. Nvidia seems to be lying a lot about the performance of that 5060, so I might look at AMD or Intel next time around. I probably need to replace my PSU while I’m at it.

    • DefederateLemmyMl@feddit.nl · 1 point · 1 hour ago

      bitcoin mining

      That’s a thing of the past; it’s not profitable anymore unless you use ASIC miners. Some people still GPU-mine niche coins, but it’s nowhere near the scale it was during the bitcoin and ethereum craze a few years ago.

      AI is driving up prices, or rather, it’s reducing availability, which then translates into higher prices.

      Another thing is that board manufacturers, distributors and retailers have figured out that they can jack up GPU prices above MSRP and enough suckers will still buy them. They’ll sell less volume, but they’ll make more profit per unit.

    • Valmond@lemmy.world · 3 points · 9 hours ago

      My kid got the 2060, and I bought an RX 6400; I don’t need the hairy arms any more.

      Then again I have become old and grumpy, playing old games.

      • WhatYouNeed@lemmy.world · 7 points · 6 hours ago

        Hell, I’m still rocking a GTX 950. It runs Left 4 Dead 2 and Team Fortress 2, what more do I need?

  • GaMEChld@lemmy.world · 8 points · 11 hours ago

    Don’t think I’ll be moving on from my 7900XTX for a long while. Quite pleased with it.

  • JackbyDev@programming.dev · 14 points · 12 hours ago

    Uhhh, I went from a Radeon 1090 (or whatever they’re called, it’s an older numbering scheme from ~2010) to an Nvidia 780 to an Nvidia 3070 Ti. Skipping upgrades is normal. Console gaming effectively does that as well. It’s normal not to buy a GPU every year.

    • 46_and_2@lemmy.world · 1 point · 3 hours ago

      As long as you make an upgrade that’s equivalent or better than the current console generation, you’re then basically good-to-go until the next generation of consoles comes.

      • JackbyDev@programming.dev · 1 point · 2 hours ago

        I don’t really care whether my current graphics are better or worse than the current console generation; it was just an illustration comparing PC gaming to console gaming.