This is getting out of hand. The other day I saw the system requirements for the game “Hell is Us” and they’re ridiculous. My RX6600 can’t play anything anymore. I’ve downloaded several PS3 ROMs and now I’m playing old games. So much better than this insanity. This is probably what I’m going to be doing from now on: playing old games.
Edit: I wanted to edit the post for more context and to vent/rant a little.
I don’t want to say I made a mistake, but I buy everything used, and I scored a good deal on two 27" 4k monitors from Facebook Marketplace. Got both monitors for $120.
They’re $800 on Amazon used. Great monitors and I love 4k. I also bought an RX6600 AMD GPU for $100 from Facebook. It was almost new. The owner upgraded and wanted to get rid of it. My whole build was very cheap compared to what I see some folks get (genuinely happy for those who can afford it. Life is too short. Enjoy it while you can).
I can’t afford these high end GPUs, but now very few games work on low settings and I’d get something like 20 FPS max. My friend gave me access to his steam library and I wanted to play Indiana Jones the other day, and it was an “omfg, wtf is this horrible shit” moment.
I’m so sick of this shit!
I don’t regret buying any of these, but man it sucks that the games I want to play barely even work.
So, now I’m emulating and it’s actually pretty awesome. I’ve missed out on so many games in my youth so now I’m just going to catch up on what I’ve missed out on. Everything works in 4k now and I’m getting my full 60FPS and I’m having so much fun.
You want to play at 4k on an old low-end card. I’m sorry, but this one is on you.
Have you tried just setting the resolution to 1920x1080 or are you literally trying to run AAA games at 4K on a card that was targeting 1080p when it was released, 4 and a half years ago?
I didn’t mention that in the OP, but Indiana Jones was running like shit at 1080p on low settings. The fucking game forces DLSS. This is where gaming is heading: forced DLSS and forced garbage so we’re forced to buy expensive shit.
That’s just on Indiana Jones and bad optimization. There are still plenty of other, newer games that should run perfectly fine. Of course the big, chunky games that are marketed as “Look at how graphically intense this is! Look at the ray tracing!” are going to run poorly.
Though I will absolutely agree that a lot of studios are throwing optimization out the window when developing new games, just relying on the latest hardware to power through it.
Yup, and the new hardware is now skyrocketing in prices. Nvidia and AMD still make 8GB cards and no new game can run on this shit anymore.
Well yes, ray tracing is required. The 6600 XT is a mid-range card from several generations ago that has very early generation RT support. Not even the 7000 series does it super well.
Expecting anything good from something ray traced on a 6600 XT is a pipe dream, let alone at 4k. Minimum requirements on Steam are usually 1080p/30fps (sometimes 720p/30fps WITH DLSS/FSR upscaling).
Yes, GPUs are expensive, but that doesn’t mean you should expect those kinds of titles to run on your hardware.
You haven’t mentioned what CPU you have; running 4k also increases that demand a considerable amount, especially with 2x 4k monitors.
If you want a GPU upgrade, the Arc Battlemage, and soon Arc Celestial, GPUs should be affordable. Battlemage is coming down to 300 bucks new right now, so those should be able to handle newer titles better. But from a second-hand perspective, DON’T expect to be able to play RT-required games, let alone without upscaling.
I hate the prices as much as anyone else, but the only option we have is to wait for better second-hand cards over time, stick to indie/AA games (not AAA or “AAAA”), and possibly run games at 1080p instead of 4k.
Best of luck getting everything running!
I tend to play shit 2+ years old…
Ditto. Whenever it goes on sale for 75% off 🤣
With GOTY and all the “expansions”
Love it when you can get those for a cheap price. I ended up getting a copy of Borderlands 1 with DLC on another disc that’s one of those advertised for xbox one and 360 sets for a good price a few years back. Don’t remember the price, but it had to have been less than $20 before tax.
This has been me since the dawn of time. I have never bought a game on release date. My son can vouch for me on this one. Whenever he asked for a PS4 game that just came out, my answer is “we need to wait until it becomes $15 - $20 at gamestop and we will get it”. I’ve never bought a PS4 game for more than $20.
I stopped at ps3/Xbox360 gen.
Back then I’d hit GameStop hard on the “buy two get one free” used days.
Used to buy dozens of titles. Haha.
Man, those were the days. I still have over 50 discs for the PS3/PS4.
Ditto. Went as far as printing box art for the generic GameStop covers.
It’s not just insane power requirements. It’s putting graphics bling ahead of playability and fun.
I recently bought Doom 2016 and Mosa Lina. I’ve had more fun with the latter than the former, even if I’ve been a Doom and Doom II player all my life.
It’s fucking insanity. I got a great deal on two 4k monitors from Facebook Marketplace (I’ve edited the post for more context) and now I can’t play anything. :/
Surely you’re not trying to use a card that targets 1080p gaming for 4k.
That’s all I can afford. And there was no way I’d pass on those monitors. Everything is good when emulating, though. This post was more of a venting/ranting post.
I mean you probably are just going to have to run the games at 1080p instead of 4k until you can afford a better gpu
Some games run like shit even on 1080p. Lmao. But I get ya
You can always just run the monitors at 1080p when playing games. Perfect integer scaling doesn’t look bad if the display isn’t massive.
Sounds like you don’t understand how demanding it is for a graphics card to run at 4K. It’s like booting up the same game 4 times at 1080p. You expect your graphics card to be able to handle that?
That’s certainly fair, however it’s unrealistic to expect a card to perform the same while doing +4x the work, all else being equal.
Have you tried playing at non-native resolutions?
for me, many old games are just that: old. some still have great modern gameplay and music just as good as back then. if the artstyle is nice, i play them and don’t think they’re ugly.
some games i really do think are fucking ugly, like when the textures are very blurry and low resolution.
and these 2d games, or games with sprites (what do you call them? like age of empires 1 or 2), don’t age for me. it’s their artstyle, and if you make it 3d it’s not nice anymore.
the games i can’t play are the 3d ones where something makes the picture look all flat and confusing. metal gear solid 3, as an example.
Idk man, I bought Sol Cesto yesterday, and I’m pretty sure my toaster could run it
Edit:
RX 6600, two 4k monitors
Bruh. I have a 3080 Ti and barely feel comfortable running my games in 2k. I’m pretty sure the 6600 was made with only 1080p and lower in mind.
I dunno my 3080Ti runs 4k 90+ FPS no problem. Maybe not all games but most decently modern games.
😂. I know, dude. That’s my whole point. Why do WE have to bear the burden of optimizing the game? Why don’t the developers optimize their games? It’s bonkers that the game “Hell is Us” listed the 4090 as a minimum requirement to run 4k at 30 fps. I was like wut?
Because running 4k is extreme. Asking it to run well at 4k is asking them to quadruple the pixels for the same processing cost. You’re saying you want the vast majority of people who dont have a 4k setup to have their games downgraded so they’ll run well on your boutique monitor.
It’s a balancing act, and they can either make the game look like something from 2010 on all systems just to make sure it runs in 4k on older cards, or they can design it to look good in 1080p on older or cheaper cards, which is fine for most people.
If you want to game in 4k, you need to buy a video card and monitor to support it. Meanwhile, I’ll keep running new games on my older card at 1080 and be perfectly happy with it.
(And then you have portable boxes that somehow advertise running games in 4k 60fps for $499 🤣🤣)
That’s why I went back to the roots. I’m now playing older games at 4k 60 fps no problem. I’ll stick with emulators. I’d rather not spend the $700. I’ll still complain about new games not running for me, though. That’s the only thing I can do beside playing older games instead 😂
Or just run newer games at 1080p. Unless you’re unhealthily close to the monitor you probably won’t even see the difference.
If you’re rubbing it on a TV across the room, you probably literally can’t see the difference.
I do run them at 1080p, trust me. Here is the thing, though, running 1080p on a native 4k screen makes for a horrible looking picture. It just looks off and very bad. Try it if you can. It’s best when the screen itself is physically 1080p. I think you fat-fingered the “b” in “running”. Came out funny 😂
1080p scales to 4k perfectly unless you have a weird aspect ratio, since it can just treat each square of 4 screen pixels as 1.
What looks bad is trying to run anything between 1080p and 4k, since it’s not a perfect 4:1 relationship.
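The integer-scaling point is easy to sanity-check with a few lines of arithmetic (just a sketch; the resolutions are standard 16:9 modes, nothing game-specific):

```python
# Per-axis scale factor from a render resolution to a screen resolution.
# Whole-number factors on both axes mean each rendered pixel maps to an
# exact square block of screen pixels, so no blurry filtering is needed.
def scale_factor(render, screen):
    rw, rh = render
    sw, sh = screen
    return sw / rw, sh / rh

print(scale_factor((1920, 1080), (3840, 2160)))  # (2.0, 2.0): clean 2x2 blocks
print(scale_factor((2560, 1440), (3840, 2160)))  # (1.5, 1.5): fractional, so it blurs
```

That fractional 1.5x factor is exactly why 1440p on a 4k panel looks soft while 1080p can look crisp.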
You’ll want to use Lossless Scaling. It’ll quadruple the pixels without any filtering and make the output not look weird on a 4k display.
Elaborate, please. What res would that be?
It’s not bonkers, though. Fill rate (the time it takes to render all the pixels your monitor is displaying) is a massive issue with increasingly photorealistic games, because you can’t rely on any simple tricks to optimize the rendering pipeline: there is so much detail on the screen that every single pixel can potentially completely change at any given moment, and also be very different from its neighbors (hence the popularity of temporal upscalers like DLSS, because extrapolating from the previous frame(s) is really the last trick that still kind of works. Emphasis on “kind of”).
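To put rough numbers on the fill-rate point, here’s the raw pixel throughput at common targets (back-of-envelope arithmetic only; real GPU cost per pixel varies with shading complexity):

```python
# Pixels that must be shaded per second for a given resolution and frame rate.
def pixels_per_second(width, height, fps):
    return width * height * fps

at_1080p60 = pixels_per_second(1920, 1080, 60)  # ~124 million px/s
at_4k60 = pixels_per_second(3840, 2160, 60)     # ~498 million px/s
print(at_4k60 / at_1080p60)  # 4.0: 4k at 60fps is exactly 4x the pixel work of 1080p60
```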
If you don’t want to sell a kidney to buy a good GPU for high resolutions, do yourself a favor and try to get a 1440p monitor, you’ll have a much easier time running high end games. Or run your games at a lower res but it usually looks bad.
I personally experienced this firsthand when I upgraded to 1440p from 1080p a while ago, suddenly none of my games could run at max settings in my native resolution, even though it was perfectly fine before. Also saw the same problem in bigger proportions when I replaced my 1440p monitor with a 4k one at work and we hadn’t received the new GPUs yet.
I’ll just play older games. Everything runs at 4k 60 fps no issue on RPCS3 and I’ve been having a freaking blast. Started with uncharted 1, and man, I’ve missed out on this game. I’m going to stick with older games.
Once you detach your mind from the big publishers and start looking at small indie devs, there are truly astonishing games out there that could run on potatoes.
My good old RX 480 will last me for a while longer.
thundercrack
You take the blue pill, you keep fanboying, you keep going broke, you eat the bugs, and you like it.
You take the red pill, I show you how powerful the well optimized potato can be.
Indie devs are the only ones making new games anyway. Everything else is just this year’s bells and whistles slapped onto the same game they’ve sold to the same audience annually.
That’s another part of gaming I’m going to start to look into, indie games. Thank you for the reminder.
Obligatory reminder that Titanfall 2 is pretty great, runs great (it’s built on a modified Portal 2 era Source branch), looks great… and with Northstar, people figured out how to mod the client to work with custom servers and mods… and it all works on Linux as well; it literally has its own custom Proton branch.
But uh yeah, the new AAA graphics paradigm is:
Everything is a dynamic light, game devs don’t optimize shit anymore because they’re all being slave driven by corporate…
…because everything is built for stupid high graphical realism fidelity, few AAAs make any novel or engaging, fully fleshed actual gameplay…
But hey, nobody has to bake light and shadow maps anymore!
Assuming, of course, your card can support real time raytracing, which it can’t, so we had to invent intelligent frame upscaling to replace well optimized AA methods, but that also isn’t enough, so we had to invent (fake) frame gen.
Oh and all the cards that can do this are all sitting around double MSRP, because board partners can do whatever the fuck they want, and retailers don’t even bother to attempt to stop scalpers.
…
The 9060 XT launches this week, and it’s supposed to MSRP at $350 for the 16 gig model.
My guess is we will get maybe 12 hours of prices between $350 and $450 (for the fancier partner models)… and then $500 to $550 will be the new ‘baseline’ price for whatever is left by next week.
If you’re planning on trying to get a 9060 XT, good fucking luck, you’re almost certainly gonna need it.
…
The GPU market is largely doing the same thing that’s happened with cars and housing: everything is a luxury model, few to no viable economy options even get newly manufactured, then the entire consumer base goes into debt to keep up their lifestyle, then all the debt bubbles pop and consumer spending ability craters… and… maybe then, 6 months to a year after that, the GPU manufacturers could possibly start releasing actual economy models? Maybe?
…
Either way, a whole lot of AAA studios are either going to keep monetizing harder and harder… or realize that as we enter this 2nd Great Depression… that shit ain’t gonna work for a mass consumer base, people just won’t have the money.
Or I guess Klarna and Afterpay come built in to GTA 6 Online shark card purchases.
Why not?
Damn, you fucking spoke my heart. This is the whole point of my post. I won’t repeat what you said, but thank you for crafting this comment. Nailed it.
There are new games?
In all seriousness, the indie scene is full of new games that don’t require a new GPU to run: A Day of Maintenance, Epiphany City, Mouthwashing, Unsorted Horror, and The Cosmic Wheel Sisterhood (to name the few specifically new games that I’ve played and really loved on my RX580, which is about 40% less powerful).
Thank you for mentioning these names. This is basically what I’m doing now, indie and emulation. It’s all just for fun.
Okay I’m going to go against the grain, and will probably get downvoted to hell, but this is not new. This is PC gaming. This has always been PC gaming. Hot take - you don’t need 4k@60fps to be able to have fun playing games.
New games require top of the line hardware (or hardware that doesn’t even exist yet) for high-ultra settings. Always have, always will. (Hell, we had an entire meme about ‘can it run Crysis’, a game that literally could only be played on low-medium on even the highest-end machines for a few years.) Game makers want their games to not just work now, but to look great in 5 years too. Unless you have shelled out over a grand this year for the absolute latest GPU, you should not expect any new game to run on great settings.
In fact, I do keep my PC fairly bleeding edge and I can’t drive more than High settings on most games - and that’s okay. Eventually I’ll play them on Ultra, when hardware catches up. It’s fine.
And as for low to mid level hardware I was there too - and that’s just PC gaming friend. I played Borderlands and Left4Dead the year they came out on a very old Radeon card at 640x480 in windowed mode, medium settings, at about 40fps.
Again, this is just what PC gaming is. If you want crisp ultra graphics, you’re gonna have to shell out the ultra payments. Otherwise, fine-tuning low to medium settings and becoming okay with sub-60fps is all fairly normal.
Personally, when I upgrade I find great joy in going back and “rediscovering” some of the older games and playing them on ultra for the first time.
And honestly, most modern games still look great on medium or low if you just put the textures on high. And that usually only affects VRAM usage and not performance.
I usually buy my PCs with an expected lifetime of about 10 years. And I don’t even buy the highest level components. Just enough to get High graphics at whatever is currently the most common resolution. After those ten years they usually can still play the newest releases at low settings while still looking better than ten year old games. You just have to play around with the settings a bit.
My Steam Deck is the only gaming PC I have that is actually struggling having new releases look good. But that was to be expected. And most of them still work if I tolerate abysmally low resolutions at 25 to 30 fps.
I disagree, and I’m too lazy to explain, but in short: it’s not about High/Ultra settings, those are just names for the settings. It’s about how the games look, play, etc. vs how they perform. And I don’t remember PC gaming ever being this bad before, even when we got shitty console ports.
You were so lazy it sounds like you agree with the og comment, not disagree. In essence, I think your comment aligns with the one you are replying to.
(Calling u lazy is a light-hearted joke that plays into my point)
4:30AM 2.6.2025 BC (before coffee) morning me wrote that comment.
I honestly just glanced at and dreamt the comment I was replying to.
Hey man, I agree with you on principle but the fact is that you’re trying to run new AAA games with an older card at 4K.
Time marches on, and graphics demands have changed. Newer cards are built differently and games are (albeit poorly) designed to utilize the new hardware.
6600 is a fine card but yeah, you’re going to have to compromise somewhere. A lot of good advice here to tap into older games, or you can spend $180 and buy a good 1440p monitor and see if that opens up your options as well.
You’re hermit crabbing into used parts on the cheap which is great, but if you’re not willing to pay a pound of flesh for a new card then you’re going to have to settle for reduced performance - it’s that simple. Otherwise what’s the point of making better hardware, if nothing takes advantage of it?
I agree with you. That’s why I moved to emulation and playing old games. I don’t want to stress my budget for something I don’t need. I’ve been emulating PS3 games and they’ve been running fantastic. Cheers
Hell is Us requires a 5600 XT to play (from looking at its minimum requirements on Steam), you have a 6600 so you should be good (obviously not at 4k though).
Also, seeing the whole 4k thing, the monitors might not have been a mistake, but they’re probably an investment for the future, for when budget-friendly 4k cards are a thing. Until then, you’ll have to rely on upscaling from lower resolutions (if that’s possible with tools built into the game) or just playing at a non-native resolution (1080p) for new games.
Hell is Us is an extreme outlier when it comes to performance, and the RX 6600 isn’t meant to run games at 4k, period. You can probably still play most of the games on Steam at 60fps+ on medium settings, high if you play at 1440p instead. Yes, there is a problem, but I think you’re exaggerating a bit, especially considering you are playing at 4k.
Not the newest game but still newer, but one of my biggest gripes is how much you need to be able to run the newest Ratchet and Clank. I’m lucky my steam deck can run it or I’d be screwed.
It’s a PS5-only game running on a portable device. Considering the state of a lot of ports (including this one at launch lol), it’s a miracle that it runs this well.
Lmao. That game runs so bad without FSR. With FSR on, it looks so weird.
I’m so used to having a toaster for a desktop I had to look up what FSR is. I’m too college student budgeted to touch FSR.
Last I played months ago, I recall it being fine enough when I had my deck docked. FPS was stable and didn’t look too bad. Though, I’m not an expert on anything graphics, so my word doesn’t mean much.
If you find yourself playing old games more and more, [email protected] or [email protected] might just be the communities you need to join if you haven’t already.
[email protected] is another one worth looking at. It’s for people who don’t play games at launch, but wait a few years instead :)
Hell yeah, joined all three. Thank you
Remember when oblivion came out? Or crysis? They were so hard to run that they became meme benchmarks lol
And now that “gaming” is incredibly mainstream, there’s a push to be more and more marketable to investors by pushing graphics technology, because that’s what sells.
Graphics too hard to run or not, I just want good games. And all this priority on intense graphical fidelity doth not maketh for many resources for the rest of the game and often shows a priority on PROFIT over all else.
Not buying games, for any reason, including you can’t play them, is probably the best and healthiest thing to happen to gaming since indie gaming…
So, go, play your 15 year old games. Enjoy what’s actually fun. The world will be a better place for it.
Go back even further. It’s been like this ever since gaming on PC has been a thing: Doom, Wing Commander 3, Quake III, …
You have to go back to when gaming was dominated by the Amiga and Atari ST to find a time when it hasn’t been like this.
I’d say thanks to Indie games, it is actually much easier to have a pleasing gaming life on low specs nowadays.
God damn, I couldn’t agree more. It’s been a blast playing older games, not gonna lie. I just can’t understand why I need to spend $700 plus on a GPU ONLY to be able to play something.
Because investors realized that gaming is lucrative and leads society. You think it’s a coincidence that Valve created basically the first successful digital distribution platform, and now every entertainment medium has followed their model? Do you think microprocessors need to be that good on phones for web browsing? AI compute hardware came from gaming GPU technology, software originally made for and tested on games.
But, like music publishers trying to push the newest and latest and greatest music on us, only to have you realize that “oh yeah, 80s music actually kicks ass” and “oh fuck, 50s music WAS neat” or if you have or when you do “holy shit classical is actually amazing and lasted for literally hundreds of years whereas current pop music is constantly only like 2 years old”…
…Like music in that way, old games don’t just magically become irrelevant and bad, despite what pop culture may try to tell us. Compatibility may be the biggest issue, but a lot of old games are legitimately better than newer ones (a lot of old games were bad too). It’s all the social and technical evolution of the medium, and once you start being able to look at it that way - once that perspective and vibe catches on a bit more - I think gaming will be healthier.
Don’t spend 700 bucks. The game will still be there. Wait. Play fun things now that you can, and if when it’s economically playable for you it’s still fun, then it’ll still be fun. There are far too many games right now to even seriously consider fomo for anything but the MOST socially important games, and those are few and far between, and usually very easy to run.
You’re doing fine.
Don’t look at how much 5090s cost in like Australia or some other countries.
Absolutely spot on. It needs a personal change. A change in mentality, the way we think of entertainment, games in this case. We need to stop believing these profit hungry people and stop chasing after those “shadows” and “highlights” and every single detail in games. When you enjoy a game, you won’t even notice all those “details”. You’re not going to be running around in a game looking at trees and sunlight. I guarantee you that it all becomes blur in the background and you won’t even care about it. It’s all a collective anxiety they’ve trained us into. Chasing after those fps numbers and details is what got us hooked to their exorbitant prices and shit performance.

It was somewhat a wakeup call for me, if you wanna call it that. I just got tired of stressing about my hardware, which is totally fine and capable, not playing their shit and poorly optimized games. Then come those who defend corporations and berate you for “choosing the wrong resolution”. Why? I like 4k. It looks nice. Why do I have to bear the burden of their poorly optimized games? They ARE 100% more than capable of optimizing their games to run on 4k ON MY RX6600, but they don’t want to. They’re lazy. It won’t make them enough money. It won’t squeeze the last penny out of our pockets.

They’re also listening to Nvidia. Of course Nvidia wants us to believe that games are “so good” nowadays that we need their top of the line $3000 GPU to play them on 4k. How dare we peasants play on 4k on a non $3000 GPU? Blasphemy!!! Fuck’em. At least there are still folks like you out there who understand this bullshit and don’t bootlick.
Lol you’re cute.
Also, it’s a fascinating novel perspective you’ve presented, in that our caring about fps and small details is a learned obsessive behavior. I’d love to hear more about how you think that works and came to be.