
It hides low-quality animation.
Depends on the game. Is it a game where I’m there for the scenery? Crank it all up. I want to see the world they created.
Is it a competitive game? I change everything for best balance of FPS and opponent visibility without making it too ugly. Motion blur, camera shake, film grain, music, cam animations like a roll after jumping from a height, all that unneeded stuff is immediately gone. I spend a couple days digging through recommended configs online to find the right changes that suit me.
Lol motion blur makes it harder to see the scenery lol.
Why would you want motion blur in a game where looking at the scenery is the appeal?
In first-person games, FOV 100°. Much less than this and it feels impossible to see what’s going on, like I’m looking through binoculars.
For VR games, I always have to turn off the “comfort settings” (vignette, snap turn, etc). They’re super jarring, and make me feel sick almost immediately.
That stopped me from playing OG Borderlands for a long time till I found the FOV fix. It’s always first on my list, motion blur being second.
God fucking damn do i hate bloom stop putting bloom dialed up to a million in your game i want to SEE the fucking GAME not the SUN
I love bloom. Put bloom in more games. I want my game world to glow.

Just put the grease on your own eyeballs and sit really close.
I’ve already got astigmatism I don’t need any more halos on lights when my eyes do it automatically :D
Oh hold on. Lens flares though, I love lens flares.
See now that’s just a cinematic choice
Hahaha oh yes.
Agreed. Motion blur, chromatic aberration, film grain, and other variants on the theme I just don’t like.
Along with “CRT filter” in some retro-style games. I am okay with them being optional, I just wish they were off by default.
That’s actually the one I’m entirely on board with
Many retro games were designed to be viewed on a composite (or worse) signal CRT (particularly 8/16-bit consoles). They take advantage of the characteristics of those technologies to act as a final expected phase of image “processing”. (It’s a physical effect so not actually processing)
The games were never meant to be played with sharp, hard pixels. The lines were supposed to blur a bit to create a sum greater than the parts and create additional chroma and luma resolution that isn’t possible with the console hardware in isolation.
OTOH it actually has to be a good filter that mimics these characteristics correctly, if it’s just basic 1px scanlines and nothing else I’m probably not gonna use it
CRT filters are just fake scanlines for nostalgia. Blurring the screen does not accurately recreate the way the games were meant to be viewed, because CRTs are analog and don’t have discrete pixels at all; the color phosphors can be partially lit, resulting in a better-looking image. That just can’t be recreated with a filter.
I wonder if you could create a shader that would do the job?
Bad ones are yes, I addressed that at the end
The good ones make use of the higher resolution screens we have today to render the lower resolution images using the real pixels as subpixels to mimic the effect.
1080p is possibly not enough resolution to be convincing if that’s what you last looked at; but at 4K, every 240p/480i console output pixel gets something like 8-16 real subpixels to work with
If you want 1 to 1 accuracy, yes you’ll only get that with an actual CRT. But the modern high quality filters are much, much more than just fake scanlines, and can be pretty effective for the games that need them. You can usually choose the signal path to emulate, choose to use an aperture grille or different kinds of shadow mask, and often even deeper tweaks.
If you’ve not looked at them in a while and you’ve got appropriate hardware to run them well, have another look.
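If you want the napkin math behind the “real pixels as subpixels” idea, here’s a quick Python sketch. The 320×240 source size is just a typical example resolution for those consoles, not any specific system:

```python
# Back-of-the-envelope: how many real screen pixels a CRT filter gets to
# work with per original console pixel, at common monitor resolutions.
# 320x240 is a typical 8/16-bit-era output size, used here for illustration.

def subpixels_per_source_pixel(screen_w, screen_h, source_w=320, source_h=240):
    """Return (horizontal, vertical) real pixels available per source pixel."""
    return screen_w / source_w, screen_h / source_h

for name, (w, h) in {"1080p": (1920, 1080), "4K": (3840, 2160)}.items():
    sx, sy = subpixels_per_source_pixel(w, h)
    print(f"{name}: {sx} x {sy} real pixels per source pixel")
```

At 4K that works out to a 12×9 grid of real pixels per source pixel, which is plenty of room to paint a shadow mask or aperture grille; at 1080p you only get 6×4.5, which is why the effect tends to look cruder there.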
I last looked at a real CRT because it is my only screen 😂
Well fair play then! You’re probably not gonna get much out of them then haha
I like it on Raccoin because it stylizes the game more than anything, and can give it a lower-res feel without making the UI elements painful.
I grew up with the CRT scan lines and honestly I prefer the hard pixels. Don’t know I think it just triggers the nostalgia better than the muddied lines that look like a bad attempt at upscaling.
Fair play, people’s preferences are ultimately a subjective thing
I would say when it’s done properly it genuinely looks like the devs have found resolution that doesn’t exist in the image
Here’s a good example that I think demonstrates it clearly (though this is a real-deal CRT in the image rather than the kind of filters I’m talking about in this thread, the goal is the same effect)

(Via https://mastodon.social/@ponysmasher/111025666005999438)
Oh, don’t get me wrong. I give them credit for being able to make the pixel art look much better than was intended, but I think my brain fills in the detail without wanting a filter to do it. Or I have yet to see a filter that is as good as a CRT and I’m not about to go out and buy one to get that experience.
I always turn subtitles on. I’ve played a handful of games with spatial audio, where NPCs are doing the walk and talk thing, sometimes being far enough away I can’t hear. Subtitles make sure I don’t miss the silly filler dialogue.
Also always invert the camera vertically. Early on, my brother explained how GoldenEye or Pilotwings or something, up looks down and down looks up. Now I’m married to that idea years later. Mainly in first person. I forget which way is natural in 3rd person, so I just invert for consistency there.
I invert y-axis in first person, but in third person with a joystick i invert x-axis too. The way that makes sense to me is the joystick controls the orbit of the camera. I know this is insane, but like you, I’m also married to the idea. I think maybe it depends on if it’s a shooter or an arpg like kingdom hearts where one joystick moves the character and the other joystick moves the camera.
Screen shake gets turned off. If there’s a setting for flashing or strobe that gets turned off as well.
It makes it more “cinematic.”
Literally the only reason for it to exist I ever heard from game designers.
As far as I am concerned, it does not need to exist; your eyes and brain will create motion blur even without the game itself making it blurry when you move.
I also hate depth of field (again, brain will do this automatically, it doesn’t need to be simulated), film grain, chromatic aberration (unless I am a robot or literally using a camera), and vignetting (seriously, why does everything have to have a vignette? It only makes sense in a 1st person game where there is something on my face, but they throw this shit onto everything. RDR2 can’t even turn it off, so the corners of the screen are basically just black the whole time and it looks ugly).
Depth of field prevents assets from loading unnecessarily.
Depth of Field is the blurring effect to draw focus onto an object. Like when you ADS and everything except the area around the gun in your hand becomes blurred. The assets are already visible and loaded, and the effect is a shader.
You might be thinking LOD distance or FOV.
It can be used to disguise low LODs too.
I might be thinking of LOD. Though, it also could have just been something like draw distance. Not like games are 100% consistent.
Draw distance usually affects whether something is rendered in the distance at all. LOD (level of detail) allows for highly detailed models (higher-resolution textures, more polygons) to be swapped with lower-detailed versions of the same model as their distance increases. Adjusting an LOD slider would usually determine how far a model can get before it starts progressively swapping to the lower-detail ones.
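To make the distinction concrete, here’s a toy Python sketch of the two mechanisms. The distance thresholds and model names are made up for illustration, not from any real engine:

```python
# Toy sketch of distance-based LOD selection vs. draw-distance culling.
# Thresholds and model names are invented for illustration.

LOD_LEVELS = [
    (0.0,   "lod0_full_detail"),  # closest: full polygon count & textures
    (50.0,  "lod1_medium"),
    (150.0, "lod2_low"),
    (400.0, "lod3_billboard"),    # farthest: flat sprite stand-in
]

DRAW_DISTANCE = 1000.0  # beyond this, the object isn't rendered at all

def pick_model(distance):
    """Return the model to render at this distance, or None if culled."""
    if distance > DRAW_DISTANCE:
        return None  # draw distance: object disappears entirely
    chosen = LOD_LEVELS[0][1]
    for threshold, model in LOD_LEVELS:
        if distance >= threshold:
            chosen = model  # LOD: progressively cheaper stand-ins
    return chosen

print(pick_model(10))    # full-detail model up close
print(pick_model(200))   # low-detail swap at range
print(pick_model(5000))  # culled past the draw distance
```

Raising the LOD slider in a settings menu effectively pushes those thresholds further out; raising draw distance pushes out the cull point.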
Mouse acceleration off, mouse sensitivity all the way down, mouse smoothing off, raw mouse input on, dlss off, resolution scale 100%+, anti aliasing off, invert mouse y-axis on.
Why turn aa off?
It makes everything blurry
That usually depends on the specific type of AA.
FXAA (Fast Approximate) is known for looking like vaseline smeared on the screen (used because it doesn’t give much performance hit), but other types are usually effective at removing aliasing on sharp corners without adding a blurring effect.
SSAA (Super Sampling) applies to the entire screen, and in a way is sort of rendering the whole game at a higher resolution and scaling it down to your monitor resolution. Downside is that can bring performance down too much if your hardware isn’t up for it.
MSAA (Multi Sampling) focuses just on the edges of things, where it’s most likely that the jaggy aliased pixels would be most noticeable. This is much easier for your computer to do, since it’s not doing the entire screen.
Of those, only FXAA should be blurry. However, there are now temporal AA methods like DLSS or DLAA, which can introduce a motion-blur effect during movement.
Lastly, FSR upscales a lower resolution to a higher one, which can make things blurry if the base image is too low of a resolution and lacks enough information to guess how to upscale it.
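For anyone curious what supersampling actually does to the pixels, here’s a minimal pure-Python sketch of the SSAA idea: render at 2× resolution, then average each 2×2 block down to one output pixel. Grayscale values are made up for illustration:

```python
# Minimal illustration of SSAA: render at 2x resolution, then average each
# 2x2 block down to one output pixel. Pure Python, grayscale, no libraries.

def ssaa_downsample(hi_res, factor=2):
    """Average factor x factor blocks of a 2D grayscale image."""
    h, w = len(hi_res), len(hi_res[0])
    out = []
    for y in range(0, h, factor):
        row = []
        for x in range(0, w, factor):
            block = [hi_res[y + dy][x + dx]
                     for dy in range(factor) for dx in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

# A hard diagonal edge at 2x resolution: white (1.0) above, black (0.0) below.
hi = [
    [1.0, 1.0, 1.0, 1.0],
    [0.0, 1.0, 1.0, 1.0],
    [0.0, 0.0, 1.0, 1.0],
    [0.0, 0.0, 0.0, 1.0],
]
print(ssaa_downsample(hi))  # [[0.75, 1.0], [0.0, 0.75]]
```

The edge pixels come out as intermediate values instead of a hard 0/1 staircase, which is exactly the smoothing you pay the 4× render cost for.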
It depends on which method of AA, to largely varying degrees, but they all blur, as is literally their function. For whatever reason, at 1440p, my eyes and brain overwhelmingly recognize shapes and depth better and way faster with no AA. To me, applying DLAA is like another layer of filter over what I’m seeing. My goal is to have a level of pixel density per degree of field of vision where pixels or jaggies are indiscernible at any level - and then I’ll enable AA as a kind of ultra fine dither. I know that sounds directly contradicting, but it makes sense to me.
Edit: it’s important to remember that this is a game and not a still photograph. If the game is less a game and more an interactive film, then maybe I’ll leave AA on. But generally I like to play games when I want to play games, and watch stuff when I want to watch stuff. Good questions to ask yourself when playing a “game” are: “Is this actually a game? Does it have a fail-state? What is the fail-state?” A lot of “games” these days are barely or not even games, and even within the game category, there’s a lot of what I’m now calling Darkslop, which is garbage games filled with dark patterns by the devs. Most “live service” “games” are Darkslop
I still use a 1080p monitor myself, so I’m almost certainly benefiting from AA more than your 1440p or someone else with 4k, so I can see why you would prefer it off since it’s not quite as needed anyway. I imagine at 4k on a smaller-ish screen AA may not be needed at all.
But generally I like to play games when I want to play games
As I’ve begun to try out older consoles for the first time that only output 480p (like the Wii), the aliasing on a modern TV can sometimes be distracting enough to detract from the experience, something that probably would’ve been less noticeable on an old CRT with its natural anti-aliasing.
I agree that raw resolution is king, but for those of us with lower resolution hardware, I don’t feel like anti-aliasing makes a game less of a game, personally.
I think you may have misunderstood my intent of discussing “no true games”. It’s not that AA makes a game less of a game, that’s a wild take… But that when I’m playing a game where I need any sort of reaction speed or, especially, framerate, AA is not only resource hungry (ESPECIALLY on older games/hardware), leading to lower fps, but it also decreases the visual intelligibility of a game in motion by having it never be crisp and perfect. Whereas slow or non-moving game cameras look more “photoreal”, which is not a quality that benefits the playing of a game, and is generally strongly associated with selling people on looks and visual fidelity over fun game-content.
Rereading it again, I absolutely did misread you there, and conflated what you were saying about mostly cinematic games to AA in general, my bad!
I guess for me I don’t really worry about maximizing performance or raw FPS in a game unless it’s an online competitive shooter with team mates who would get annoyed at me if low performance was making me a hindrance.
For older games, I usually crank up the graphics options since my more modern hardware can easily max it out, and most single player games I play don’t seem to require the best possible frame rates to play well, so I prefer the smoother edges as long as I can keep it above 60fps :p
Motion blur always off, no exceptions.
Vsync always off, no exceptions.
I’m really curious why you would turn vsync off. I know in competitive gaming it can possibly be a detriment, but aside from a very tiny amount of delay what is the downside?
In my experience it causes a very large amount of delay in almost every game I have used it on. Even in a casual gaming setting it can make playing feel a lot worse.
Yup, motion blur goes off, bloom gets cranked way, way down (a tiny bit I’m ok with), chromatic aberration, film grain, or basically any filter meant to imitate the effects of older camera technology gets shit-canned. FoV usually needs to get cranked up, music volume is pushed down to 50% or lower. After that it’s usually a dance of getting mouse/controller sensitivity adjusted and min-maxing the graphics settings a bit to have good graphics and ok framerates. Then there is the final key-bindings adjustments, though those often take a while of gameplay to sort out.
Motion blur adds a touch of realism to images as it mimics how a camera sensor captures motion according to its shutter speed. It also helps imitate how human eyes work; try waving your hand in front of your face and you’ll see something very similar to motion blur.
My human eyes create motion blur just by objects moving on the screen, there is no need for additional blur.
I don’t see how it adds realism.
My human eyes create motion blur just by objects moving on the screen, there is no need for additional blur.
They don’t because those objects don’t move. You look at static images. Objects that would move fast enough to create a motion blur for your eyes rather create weird optical illusions.
Like, I get that you dislike the fake motion blur effect, so do I. But your eyes don’t compensate, because screens and games run at discrete refresh rates and nothing moves in between frames.
Nothing’s moving on the screen. You’re reconstructing movement from a series of still images.
Motion blur—when it’s implemented with understanding of why it works—provides additional information about speed and direction of movement in a way that your brain is naturally apt to deal with.
Without motion blur, you need at least two frames to figure out direction of movement. With it, you’ll have enough information to start priming response in one frame. And consistent pattern of priming actions and successfully executing them translates into a physically satisfying experience.
I also dislike motion blur and turn it off because it gives me a headache, but let’s not pretend that motion blur doesn’t do anything at all, because it does.
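The “one frame is enough to read direction” point can be shown with a toy sketch. This is a made-up 1-D model, not how any real engine renders blur, but it shows how per-object blur bakes velocity into a single frame:

```python
# Sketch of the "one frame tells you direction" point: per-object motion
# blur smears a sprite along its velocity, so a single frame encodes both
# speed (smear length) and direction (which side trails). Toy 1-D model;
# the cell grid and parameters are invented for illustration.

def blur_along_velocity(position, velocity, shutter=1.0, samples=5):
    """Return the sorted 1-D cells a moving dot covers during one exposure."""
    return sorted({round(position + velocity * shutter * i / (samples - 1))
                   for i in range(samples)})

# Dot at cell 10 moving right at 4 cells/frame: smear covers 10..14.
print(blur_along_velocity(10, 4))   # [10, 11, 12, 13, 14]
# Same dot moving left: smear covers 6..10 instead.
print(blur_along_velocity(10, -4))  # [6, 7, 8, 9, 10]
```

Without the smear, both frames would be an identical single dot at cell 10, and you’d need a second frame to tell the two cases apart.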
Motion blur happens in real life when an object moves quickly through our field of view and the image smears on our retina. But the effect is different depending on speed and distance; a car whizzing right past you appears blurred, but watching that same car at the same speed when you are standing 200 meters away is not blurred, because the car occupies less of your field of view and so moves relatively slower across it - slow enough to not blur.
To get one thing out of the way, old LCD monitors especially (but even some newer monitors) can intrinsically suffer from what looks like motion blur, which is when the monitor can’t update quick enough and moving objects leave a kind of ghosting or smearing effect behind them. High quality screens however will have very little noticeable intrinsic blurring.
Assuming a decent screen then, motion doesn’t intrinsically blur - not always. Sufficiently quick events will indeed still look blurred due to natural human eyeball motion blur, but lots of things that would blur in real life might not. The car whizzing right past you won’t necessarily appear blurred at all, because what in the game represents to your player character a very close and fast movement is on screen perhaps relatively much slower and further away from the perspective of the human sat at the desk looking at it on a monitor, and so doesn’t read as blurred to our eyes in the same way.
So, motion blur in games is an attempt to take movements that would look blurry in real life, and apply artificial blur to make them blurry on the screen.
I don’t personally like in-game motion blur, and even if it didn’t give me a headache I’d still turn it off for stylistic preference reasons, but it is a real thing, and does try to achieve something specific.
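To put rough numbers on the car example: what determines perceived blur is how fast the object sweeps across your field of view, and that angular speed falls off with distance. A quick Python check (the 30 m/s speed and distances are arbitrary illustration values):

```python
# Rough numbers for the car example: angular speed across your field of
# view at closest approach is speed / distance (in radians per second).
# The speed and distances here are arbitrary, chosen just to illustrate.
import math

def angular_speed_deg(speed_mps, distance_m):
    """Degrees per second an object sweeps across your view at closest approach."""
    return math.degrees(speed_mps / distance_m)

for d in (5, 50, 200):
    print(f"car at 30 m/s, {d} m away: {angular_speed_deg(30, d):.1f} deg/s")
```

The car 5 m away sweeps hundreds of degrees per second (hopelessly blurred), while the same car 200 m away crawls along at under 10 deg/s, slow enough for your eyes to track it sharply.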
High quality motion blur actually looks quite decent. Almost no game has that tho
✨Cinematic✨