I have a 1920x1080 monitor and a rig powerful enough to run any of my games playably at that resolution, even Crysis. I've heard that anti-aliasing (AA) isn't necessary at high resolutions — that adding AA at high res is almost completely unnoticeable and only murders your framerate.

The only function of AA is to make jagged edges appear smoother, and I believe high resolutions have the same effect, eliminating the need for AA. AA at high res might make some difference if you look very closely, but otherwise it's mostly unnoticeable. I play games at 1080p with 0x AA and I don't see any jagged edges, yet when I drop the resolution, the jaggies become very obvious. (Anisotropic filtering (AF) is a different story.)

So which is better: low resolution + AA, or high resolution without AA? Not to mention that turning AA off entirely (especially at high res) significantly improves your framerate.
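The intuition above can be put in rough numbers: on a fixed-size panel, higher resolution means more pixels per inch, so each "stair step" on a jagged edge gets physically smaller and harder to see. Here's a minimal sketch, assuming a hypothetical 24-inch 16:9 monitor (plug in your own screen's diagonal):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch for a display of the given resolution and diagonal size."""
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_px / diagonal_in

# Assumed 24-inch panel; resolutions are common 16:9 steps.
for w, h in [(1280, 720), (1600, 900), (1920, 1080)]:
    density = ppi(w, h, 24.0)
    # Each stair step on a jagged edge is roughly one pixel across,
    # so its physical size shrinks as pixel density rises.
    step_mm = 25.4 / density
    print(f"{w}x{h}: {density:.0f} PPI, pixel ~{step_mm:.2f} mm")
```

At 1080p on a 24-inch screen each pixel is under a third of a millimetre, which is why the edges look smooth from normal viewing distance; at 720p on the same panel each step is about 50% larger, so aliasing jumps out.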