Always the first thing I turn off, but surely there are some people out there that actually like it. If you’re one of those people, is there a particular reason?

  • stevestevesteve@lemmy.world · 6 days ago

    Motion blur in film does that, but with video games, in every implementation I’ve seen, you don’t get a blur that works the same way. Movies generally blur 50% of the motion between frames (a “180 degree shutter”): a smooth blur based on motion alone. Video games generally just blur multiple frames together (sometimes more than two!), leaving all of the distinct images there, just overlaid instead of actually motion blurred. So if something moved from one side of the screen all the way to the other within a single frame, you get double vision of that thing instead of an almost invisible smear across the screen. To do it “right” you basically have to do motion interpolation first, then blur based on that, and if you’re doing motion interpolation you may as well just show the sharp interpolated mid-frames.
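    Here’s a minimal numpy sketch of that frame-blending failure mode (purely illustrative; the function and the two-frame demo are made up, not taken from any particular engine):

    ```python
    import numpy as np

    def frame_blend(frames):
        # Average the last few sharp frames. Every input image stays
        # visible as a distinct ghost at reduced opacity -- no real smear.
        return np.mean(np.stack(frames, axis=0), axis=0).astype(np.uint8)

    # Hypothetical demo: a square jumping across the screen in one frame step.
    h, w = 64, 256
    f0 = np.zeros((h, w, 3), np.uint8); f0[24:40, 10:26] = 255    # frame 1: left side
    f1 = np.zeros((h, w, 3), np.uint8); f1[24:40, 230:246] = 255  # frame 2: right side
    blended = frame_blend([f0, f1])
    # Result: two half-bright copies ("double vision"), not a faint smear between them.
    ```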

    On top of that, motion blur tends to be computationally expensive, so you end up with an illegible 30 fps instead of a smooth 60+.

    • Lojcs@lemm.ee · 6 days ago

      This is not how motion blur works at all. Is there a specific game you’re talking about? Are you sure this is not monitor ghosting?

      Motion blur in games costs next to no performance. It does use motion data, but not to generate in-between frames; it smears the pixels of the existing frame.
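
      For reference, the standard post-process approach is a per-pixel smear along the engine’s velocity buffer. A rough numpy sketch of that idea (the name, sample count, and centered shutter here are assumptions; real engines do this in a fragment shader):

      ```python
      import numpy as np

      def velocity_blur(frame, velocity, samples=8):
          # Smear each pixel of the *current* frame along its motion vector.
          # `velocity` is the per-pixel (dx, dy) buffer the renderer already
          # has; no intermediate frames are generated or stored.
          h, w, _ = frame.shape
          ys, xs = np.mgrid[0:h, 0:w]
          acc = np.zeros_like(frame, dtype=np.float32)
          for i in range(samples):
              t = i / (samples - 1) - 0.5  # offsets from -0.5 to +0.5 along the vector
              sx = np.clip(xs + velocity[..., 0] * t, 0, w - 1).astype(int)
              sy = np.clip(ys + velocity[..., 1] * t, 0, h - 1).astype(int)
              acc += frame[sy, sx]
          return (acc / samples).astype(frame.dtype)
      ```

      The whole cost is a handful of texture reads per pixel, which is why it’s cheap compared to rendering or interpolating extra frames.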