It depends on which method of AA, and to widely varying degrees, but they all blur, as is literally their function. For whatever reason, at 1440p, my eyes and brain overwhelmingly recognize shapes and depth better and way faster with no AA. To me, applying DLAA is like another layer of filter over what I’m seeing. My goal is to have a level of pixel density per degree of field of vision where pixels or jaggies are indiscernible at any level - and then I’ll enable AA as a kind of ultra-fine dither. I know that sounds directly contradictory, but it makes sense to me.
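If you want to put a rough number on that “pixels per degree” idea, here’s a quick Python sketch; the 27-inch / 60 cm figures are just example assumptions on my part, and ~60 PPD is the figure usually quoted as where 20/20 vision stops resolving individual pixels.

# Rough pixels-per-degree (PPD) estimate for a flat screen viewed head-on.
# The 27" 16:9 panel at ~60 cm viewing distance is just an example assumption.
import math

def pixels_per_degree(h_res, diag_inches, aspect, view_dist_cm):
    """Approximate horizontal pixels per degree at the center of the screen."""
    diag_cm = diag_inches * 2.54
    # Physical screen width from the diagonal and the aspect ratio (width/height).
    width_cm = diag_cm * aspect / math.sqrt(aspect ** 2 + 1)
    # Horizontal angle the screen subtends from the viewing distance.
    fov_deg = 2 * math.degrees(math.atan((width_cm / 2) / view_dist_cm))
    return h_res / fov_deg

for h_res, label in [(1920, "1080p"), (2560, "1440p"), (3840, "4K")]:
    print(f"{label}: ~{pixels_per_degree(h_res, 27, 16 / 9, 60):.0f} PPD")

On those example assumptions 1440p works out to somewhere in the high 40s of PPD, which would be consistent with jaggies still being visible there.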
Edit: it’s important to remember that this is a game and not a still photograph. If the game is less a game and more an interactive film, then maybe I’ll leave AA on. But generally I like to play games when I want to play games, and watch stuff when I want to watch stuff. Good questions to ask yourself when playing a “game” are: “Is this actually a game? Does it have a fail-state? What is the fail-state?” A lot of “games” these days are barely or not even games, and even within the game category there’s a lot of what I’m now calling Darkslop: garbage games filled with dark patterns by the devs. Most “live service” “games” are Darkslop.
I still use a 1080p monitor myself, so I’m almost certainly benefiting from AA more than you are at 1440p, or than someone else at 4K, so I can see why you would prefer it off since it’s not as needed anyway. I imagine at 4K on a smaller-ish screen AA may not be needed at all.
But generally I like to play games when I want to play games
As I’ve begun to try out older consoles for the first time that only output 480p (like the Wii), the aliasing on a modern TV can sometimes be distracting enough to detract from the experience, something that probably would’ve been far less noticeable on an old CRT with its natural anti-aliasing.
I agree that raw resolution is king, but for those of us with lower resolution hardware, I don’t feel like anti-aliasing makes a game less of a game, personally.
I think you may have misunderstood my intent in discussing “no true games”. It’s not that AA makes a game less of a game, that’s a wild take… But when I’m playing a game where I need any sort of reaction speed, or especially framerate, AA is not only resource-hungry (ESPECIALLY on older games/hardware), which costs fps, but it also decreases the visual intelligibility of a game in motion by never letting it be crisp and perfect. A slow or non-moving game camera, by contrast, looks more “photoreal”, which is not a quality that benefits the playing of a game, and is generally strongly associated with selling people on looks and visual fidelity over fun game content.
Rereading it again, I absolutely did misread you there, and conflated what you were saying about mostly-cinematic games with AA in general, my bad!
I guess for me I don’t really worry about maximizing performance or raw FPS in a game unless it’s an online competitive shooter with teammates who would get annoyed at me if low performance were making me a hindrance.
For older games, I usually crank up the graphics options since my more modern hardware can easily max them out, and most single-player games I play don’t seem to require the best possible frame rates to play well, so I prefer the smoother edges as long as I can keep it above 60fps :p