High-resolution gaming reduces the need for anti-aliasing?

  1. I have a 1920x1080 monitor and a rig powerful enough to run any of my games playably at that resolution, even Crysis. I've heard that AA (anti-aliasing) isn't necessary at high resolutions, that its effect at high-res is almost completely unnoticeable, and that it only murders your framerate.

    The only function of AA is to make jagged edges appear smoother. I believe a high resolution has the same effect, eliminating the need for AA. AA at high-res might still make a difference if you look very closely, but otherwise it is mostly unnoticeable.

    I play games at 1080p with 0x AA and I do not see any jagged edges. But when I reduce the resolution, the jagged edges become very obvious.
    Anisotropic filtering (AF) is a different story.

    So which is better: low resolution with AA, or high resolution without AA?

    Not to mention that turning off all AA (especially at high-res) significantly improves your framerates.
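
    To make this concrete, here is a rough Python sketch (not taken from any game engine; just a toy rasteriser I made up for illustration). It renders a slanted edge once with a single sample per pixel centre (no AA) and once by averaging a 4x4 grid of sub-samples per pixel (supersampling). The no-AA image shows hard one-pixel stair steps; the supersampled one blends the edge pixels into intermediate shades. A higher resolution does not remove the steps, it just makes each step smaller on screen, which is why they are harder to notice at 1080p.

    ```python
    # Toy rasteriser: the "scene" is a half-plane, covered below the edge y = 0.3 * x.
    # samples_per_axis=1 is "no AA"; larger values average an n x n sub-sample grid
    # per pixel (supersampling), so edge pixels get intermediate coverage values.

    def scene(x, y):
        """Return 1.0 (covered) below the slanted edge y = 0.3*x, else 0.0."""
        return 1.0 if y < 0.3 * x else 0.0

    def render(width, height, samples_per_axis=1):
        """Rasterise the scene into a width x height grid of coverage values."""
        image = []
        n = samples_per_axis
        for py in range(height):
            row = []
            for px in range(width):
                total = 0.0
                for sy in range(n):
                    for sx in range(n):
                        # Sub-sample positions spread evenly inside the pixel.
                        x = px + (sx + 0.5) / n
                        y = py + (sy + 0.5) / n
                        total += scene(x, y)
                row.append(total / (n * n))   # pixel value = average coverage
            image.append(row)
        return image

    def show(image, label):
        ramp = " .:-=+*#%@"                   # map coverage 0..1 to ASCII shades
        print(label)
        for row in reversed(image):           # print with y increasing upward
            print("".join(ramp[min(int(v * (len(ramp) - 1)), len(ramp) - 1)] for v in row))
        print()

    show(render(24, 8, samples_per_axis=1), "no AA (hard 0/1 stair steps):")
    show(render(24, 8, samples_per_axis=4), "4x4 SSAA (edge pixels blended):")
    ```

    Note that the 4x4 supersampling in this sketch shades 16 samples per pixel, which is exactly why brute-force AA is so expensive; real games usually rely on cheaper approximations such as MSAA, which only multi-samples along geometry edges.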
     
  2. jcsd
  3. mathman

    It appears to boil down to how sharp your eyesight is. If you don't notice the aliasing, you don't need the AA software.
     
  4. I don't notice any jagged edges at 1080p, so I disable all AA. Doing this can mean the difference between a silky smooth 60+ fps and under 25 fps, especially in games like Crysis.
     