• frunch@lemmy.world · 8 days ago

    There has to be a point where we stop giving a fuck about graphics capabilities. It feels like we’ve already peaked, and any further progress in the past several years has been marginal. I mean yes, if you need ultra 4K or 8K or 16K or whatever the fuck they’re up to, then you’re gonna have to drop however much coin they demand. Truth is, nobody ever needed graphics at that level, especially for video games. Really good games have existed despite having potato graphics, and some of the most visually stunning games never reach the popularity or fandom of those potato-graphics games.

    When I come across articles like this, I like to think back to the early days of video gaming, when real ingenuity was often what moved the bar: finding ways to use chips outside their intended operation to make them do what the creators wanted. Now that those layers of friction are gone, it seems more of the attention has shifted to the graphics over the content of the game itself. There’s still plenty of good stuff out there, no doubt, and graphics do matter, but I feel most of the responsibility falls on the developers at this point. You don’t need the most graphically capable card, and even if you bought one, aren’t there still other high-priced components you’d also need to fully unlock its capabilities? Then you need games that actually take advantage of all those capabilities. This all reminds me of audiophiles and their ever-persistent hunt, not only for the perfect combination of equipment, but also for music produced at a high enough level that such sophisticated equipment is needed to hear the differences in the first place.

    We are getting priced out of high-end home computing, in any case.