Mouse acceleration off, mouse sensitivity all the way down, mouse smoothing off, raw mouse input on, DLSS off, resolution scale 100%+, anti-aliasing off, invert mouse Y-axis on.
Why turn aa off?
It makes everything blurry
That usually depends on the specific type of AA.
FXAA (Fast Approximate Anti-Aliasing) is known for looking like Vaseline smeared on the screen (it gets used because it costs almost nothing in performance), but other types are usually effective at removing aliasing on sharp edges without adding a blurring effect.
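Roughly speaking, it's a post-process blur: find high-contrast pixels by luminance and smudge them with their neighbours. A very stripped-down sketch in Python (not the real shader, which does directional edge searches, but it shows why a purely post-process pass softens things):

```python
# Grossly simplified FXAA-style pass: find high-contrast pixels by luma,
# then blend them with their neighbours. Real FXAA does directional edge
# searches; this only illustrates why a post-process filter softens the image.
def luma(rgb):
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b

def fxaa_like(image, threshold=0.1):
    """image: 2D list of (r, g, b) tuples with channels in [0, 1]."""
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            neighbours = [image[y-1][x], image[y+1][x], image[y][x-1], image[y][x+1]]
            lumas = [luma(p) for p in neighbours]
            # High local contrast => treat it as an edge and average it out.
            if max(lumas) - min(lumas) > threshold:
                out[y][x] = tuple(
                    sum(p[c] for p in neighbours + [image[y][x]]) / 5
                    for c in range(3)
                )
    return out
```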
SSAA (Super Sampling) applies to the entire screen; in a way it's sort of rendering the whole game at a higher resolution and scaling it down to your monitor's resolution. The downside is that it can bring performance down too much if your hardware isn't up for it.
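Conceptually the downscale step is just a box average over each block of samples, something like this minimal sketch (assuming a simple 2x2 supersample; real implementations use fancier sample patterns and filters):

```python
# Minimal SSAA resolve: the scene is rendered at factor x the target resolution,
# then every factor-by-factor block of samples is averaged down to one pixel.
def ssaa_resolve(hi_res, factor=2):
    """hi_res: 2D list of (r, g, b) tuples rendered at factor x the target size."""
    h, w = len(hi_res) // factor, len(hi_res[0]) // factor
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            block = [hi_res[y * factor + dy][x * factor + dx]
                     for dy in range(factor) for dx in range(factor)]
            row.append(tuple(sum(p[c] for p in block) / len(block) for c in range(3)))
        out.append(row)
    return out
```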
MSAA (Multi Sampling) focuses just on the edges of geometry, where the jaggy aliased pixels are most noticeable. This is much easier for your computer to do, since it's not supersampling the entire screen.
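A rough sketch of the idea, assuming 4 coverage samples per pixel (real MSAA hardware shades once per pixel and keeps per-sample coverage in the framebuffer; this only shows the edge-blending part):

```python
# Rough MSAA idea: shade each pixel once, but test triangle coverage at several
# sub-pixel positions, so only pixels straddling an edge get blended.
SAMPLE_OFFSETS = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]

def msaa_pixel(x, y, inside_triangle, tri_color, bg_color):
    """inside_triangle(px, py) -> bool; x, y are integer pixel coordinates."""
    covered = sum(inside_triangle(x + ox, y + oy) for ox, oy in SAMPLE_OFFSETS)
    if covered in (0, len(SAMPLE_OFFSETS)):
        # Fully inside or fully outside: one flat colour, nothing is blurred.
        return tri_color if covered else bg_color
    t = covered / len(SAMPLE_OFFSETS)  # only edge pixels get a blend
    return tuple(t * c + (1 - t) * b for c, b in zip(tri_color, bg_color))
```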
Of those, only FXAA should be blurry. However, there are now temporal methods like TAA, DLSS, and DLAA. These can introduce a motion-blur or ghosting effect during movement.
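The motion blur comes from the temporal part: every frame is blended with an accumulated history of previous frames, so anything that moves drags a bit of its past along with it. A minimal sketch of just that accumulation step (real TAA/DLAA also reprojects the history with motion vectors and rejects stale samples):

```python
# Core of any temporal AA: blend the current frame with an accumulated history.
# When the camera or objects move and the history isn't perfectly reprojected,
# this blend is what shows up as ghosting / motion blur.
def taa_accumulate(current, history, alpha=0.1):
    """current, history: 2D lists of (r, g, b); alpha = weight of the new frame."""
    return [
        [tuple(alpha * c + (1 - alpha) * h for c, h in zip(cur_px, hist_px))
         for cur_px, hist_px in zip(cur_row, hist_row)]
        for cur_row, hist_row in zip(current, history)
    ]
```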
Lastly, FSR upscales a lower-resolution image to a higher one, which can make things blurry if the base image is at too low a resolution and lacks enough information for the upscaler to work with.
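This isn't FSR's actual algorithm (which is edge-adaptive and sharpened), but a plain bilinear upscale shows the underlying limitation: the output can only interpolate between pixels that were actually rendered, so detail that was never there comes out as smoothness.

```python
# Plain bilinear upscale (illustration only, not FSR itself): every output pixel
# is a weighted mix of the four nearest low-res pixels, so nothing sharper than
# the source can appear in the result.
def bilinear_upscale(src, out_w, out_h):
    """src: 2D list of (r, g, b) tuples; returns an out_h x out_w image."""
    src_h, src_w = len(src), len(src[0])
    out = []
    for y in range(out_h):
        fy = y * (src_h - 1) / max(out_h - 1, 1)
        y0 = int(fy); y1 = min(y0 + 1, src_h - 1); ty = fy - y0
        row = []
        for x in range(out_w):
            fx = x * (src_w - 1) / max(out_w - 1, 1)
            x0 = int(fx); x1 = min(x0 + 1, src_w - 1); tx = fx - x0
            row.append(tuple(
                (1 - ty) * ((1 - tx) * src[y0][x0][c] + tx * src[y0][x1][c])
                + ty * ((1 - tx) * src[y1][x0][c] + tx * src[y1][x1][c])
                for c in range(3)
            ))
        out.append(row)
    return out
```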
It depends on the AA method, to widely varying degrees, but they all blur; that's literally their function. For whatever reason, at 1440p, my eyes and brain overwhelmingly recognize shapes and depth better and way faster with no AA. To me, applying DLAA is like putting another layer of filter over what I'm seeing. My goal is to have a level of pixel density per degree of field of vision where pixels or jaggies are indiscernible at any level - and then I'll enable AA as a kind of ultra-fine dither. I know that sounds directly contradictory, but it makes sense to me.
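That pixel-density-per-degree idea can be put into numbers: pixels per degree depends on resolution, screen width, and viewing distance. A quick back-of-the-envelope calc (the screen size and distance below are just example figures, not anyone's actual setup):

```python
import math

def pixels_per_degree(h_resolution, screen_width_cm, viewing_distance_cm):
    """Approximate horizontal pixels per degree of the viewer's field of vision."""
    h_fov_deg = 2 * math.degrees(math.atan((screen_width_cm / 2) / viewing_distance_cm))
    return h_resolution / h_fov_deg

# Example figures (assumed): a 27" 16:9 panel is ~59.8 cm wide, viewed from ~60 cm.
for res in (1920, 2560, 3840):
    print(res, round(pixels_per_degree(res, 59.8, 60.0), 1), "px/deg")
```

By the common rule of thumb of roughly 1 pixel per arcminute of 20/20 acuity (about 60 px/deg), 1440p at those example figures still comes out under the line, which lines up with the jaggies still being visible there.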
Edit: it's important to remember that this is a game and not a still photograph. If the game is less a game and more an interactive film, then maybe I'll leave AA on. But generally I like to play games when I want to play games, and watch stuff when I want to watch stuff. Good questions to ask yourself when playing a "game" are: "Is this actually a game? Does it have a fail-state? What is the fail-state?" A lot of "games" these days are barely or not even games, and even within the game category there's a lot of what I'm now calling Darkslop: garbage games filled with dark patterns by the devs. Most "live service" "games" are Darkslop.
I still use a 1080p monitor myself, so I'm almost certainly benefiting from AA more than you at 1440p or someone else at 4K, so I can see why you would prefer it off, since it's not quite as needed anyway. I imagine at 4K on a smaller-ish screen AA may not be needed at all.
As I've begun to try out older consoles for the first time that only output 480p (like the Wii), the aliasing on a modern TV can sometimes be distracting enough to detract from the experience, something that probably would've been far less noticeable on an old CRT with its natural anti-aliasing.
I agree that raw resolution is king, but for those of us with lower resolution hardware, I don’t feel like anti-aliasing makes a game less of a game, personally.
I think you may have misunderstood my intent in discussing "no true games". It's not that AA makes a game less of a game; that would be a wild take. It's that when I'm playing a game where I need any sort of reaction speed, or especially framerate, AA is not only resource hungry (ESPECIALLY on older games/hardware), which means fewer fps, but it also decreases the visual intelligibility of a game in motion by keeping it from ever being crisp and sharp. Slow or non-moving game cameras, on the other hand, look more "photoreal", which is not a quality that benefits the playing of a game, and is generally strongly associated with selling people on looks and visual fidelity over fun game content.
Rereading it again, I absolutely did misread you there, and conflated what you were saying about mostly-cinematic games with AA in general, my bad!
I guess for me I don't really worry about maximizing performance or raw FPS in a game unless it's an online competitive shooter with teammates who would get annoyed at me if low performance was making me a hindrance.
For older games, I usually crank up the graphics options since my more modern hardware can easily max it out, and most single player games I play don’t seem to require the best possible frame rates to play well, so I prefer the smoother edges as long as I can keep it above 60fps :p