maverick_starstrider said:
Well let's be clear here. We're talking about 60 fps VIDEO. I could see how 30 fps graphics rendering could stutter if the frames actually arrived inconsistently, leaving the occasional large gap between frames.
No, I'm talking about even, consistent 30fps rendering. No skips, just stuttering because your eyes can still perceive the discrete steps of motion.
And remember, a pretty big fraction of movies today have graphic rendering in them.
Also, while motion blur will help, the blur is baked into the movie (or not!) and doesn't necessarily match what your eyes would generate. I once sat waaay too close to the screen at a theater, and the built-in motion blur wasn't enough to make the video look smooth: the motion simply crossed too much of my field of view per frame.
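To put rough numbers on that experience, here's a quick back-of-the-envelope sketch in Python. The screen width, seating distance, pan speed, and frame rate are all made-up illustrative values, not measurements from that theater:

import math

# Rough estimate: how many degrees of visual angle does a panning object
# jump between consecutive frames?  All inputs are assumed, illustrative values.
screen_width_m = 15.0    # assumed theater screen width
seat_distance_m = 5.0    # sitting "way too close"
pan_duration_s = 2.0     # object crosses the full screen width in 2 seconds
fps = 24                 # typical film frame rate

# Horizontal angle the screen subtends from this seat
screen_angle_deg = 2 * math.degrees(math.atan((screen_width_m / 2) / seat_distance_m))

# Angular jump between consecutive frames during the pan
deg_per_frame = screen_angle_deg / (pan_duration_s * fps)

print(f"screen spans about {screen_angle_deg:.0f} degrees of view")
print(f"object jumps about {deg_per_frame:.1f} degrees every frame")

From a seat far enough back that the screen only spans, say, 30 degrees, the same pan works out to well under a degree per frame, which is roughly why the same built-in blur can look fine from the back rows but fall apart up close.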
Here's a good article with an explanation and a downloadable demo. I tried it: it is virus free and appears to work, though you do need to ensure your video hardware is set up right and capable of running some intense graphics.
http://www.tweakguides.com/Graphics_5.html
What I'm concerned with is comparing a movie (either SD or HD) played at 30 fps, with each frame always evenly spaced, against the same movie played at 60 fps. Weren't the first movie cameras hand cranked? So I feel like we ARRIVED at 30 fps (or 32 fps) BECAUSE it seemed optimal, all those years ago.
Well, "optimal, all those years ago" doesn't necessarily imply 'optimal, today'. I can think of a number of other potentail reasons 30 (or 24?) was "optmial" (vs a higher number) that may or may not have anything to do with our eyes or a digital TV for that matter.
-Persistence of vision is different in a dark movie theater than in a bright living room.
-Cost of film.
-Reliability of high speed cameras and projectors.
-Light sensitivity of film.
-Light (and heat) generation of projectors.
-Quality and resolution of the images.