
Why is 60 fps recommended?

  1. Jan 13, 2008 #1
    Some television standards show images at 50/60 fps, and action games are meant to be played at 70-80 fps. But if our eyes retain an image for only about 1/6th of a second, why can we detect flicker even at 24 fps in some cases (and sometimes even at around 30 fps)?

    Also, what is the resolving power of our eye? This question relates to the maximum dpi we need to achieve. The eye must have some limit, for example a smallest resolvable length: if two particles smaller than that are placed side by side, the eye cannot tell them apart. Likewise, if the limit is 'x' dpi and I display two pixels at a resolution finer than 'x' dpi, the eye cannot distinguish them.
     
  4. Jan 14, 2008 #3

    Mech_Engineer

    Science Advisor
    Gold Member

    As far as I know, television shows at about 30 fps; I know that DVDs are 29.97 fps. Around 60 frames per second is about the most our eyes can process; anything above that tends to be imperceptible to us.

    As for action video games, you get the maximum visual benefit at about 60 frames per second; anything above that is just bragging rights for the power of your computer.

    I'm not sure there is a hard-and-fast number for this, but this website has some interesting conclusions:

    http://www.clarkvision.com/imagedetail/eye-resolution.html
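    A rough back-of-the-envelope sketch of the dpi question (the one-arcminute acuity figure and the helper below are illustrative assumptions on my part, not taken from that page): if the eye resolves roughly one arcminute of visual angle, the finest useful dot pitch depends on viewing distance.

    ```python
    import math

    def max_useful_dpi(viewing_distance_in, arcmin_resolution=1.0):
        """Estimate the resolution beyond which two adjacent dots can no longer
        be separated, assuming the eye resolves about `arcmin_resolution`
        arcminutes of visual angle (an assumption, not a measured figure)."""
        theta_rad = math.radians(arcmin_resolution / 60.0)
        smallest_detail_in = viewing_distance_in * math.tan(theta_rad)
        return 1.0 / smallest_detail_in  # dots per inch

    # Example: a screen viewed from about 20 inches away
    print(round(max_useful_dpi(20)))  # about 172 dpi
    ```

    Closer viewing distances push that limit higher (roughly 290 dpi at a 12-inch reading distance), which is one reason print targets higher resolutions than monitors.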

     
  5. Jan 14, 2008 #4

    mgb_phys

    Science Advisor
    Homework Helper

    It's difficult to assign hard numbers because a lot of the processing happens in your brain.
    Film runs at 24 fps, but the projector shows each frame twice, so it really presents 48 images/sec. This was a compromise worked out in the early days of film between using too much film stock and appearing jerky.
    TV shows fields: half-frames consisting of every odd or every even row alternately. So it shows 50 (Europe) or 60 (US) pictures per second, but only 25/30 full frames. As with film, this reduces the jerky effect while saving bandwidth.
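    A toy sketch of what interlacing does (purely illustrative; real de-interlacing is more involved): two half-height fields, one carrying the odd rows and one the even rows, are woven back into a single full frame, so 50/60 fields per second yields only 25/30 complete frames per second.

    ```python
    # Toy illustration of interlaced video: weave an odd-row field and an
    # even-row field back into one full frame.
    def weave(odd_field, even_field):
        frame = []
        for odd_row, even_row in zip(odd_field, even_field):
            frame.append(odd_row)   # rows 1, 3, 5, ...
            frame.append(even_row)  # rows 2, 4, 6, ...
        return frame

    odd = ["row1", "row3", "row5"]
    even = ["row2", "row4", "row6"]
    print(weave(odd, even))  # ['row1', 'row2', 'row3', 'row4', 'row5', 'row6']
    ```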

    Computer monitors show 60 full frames/sec, so they should be better than either TV or film - except for a couple of physiological effects.
    You have much better flicker perception out of the corner of your eye (you are descended from a long line of ancestors who saw something moving out of the corner of their eye in time to run away), and you can see flicker on bright objects more easily. So, sitting up close to a bright, high-contrast computer screen, you see flicker more easily; this is made worse if you are also under electric lights that flicker at the same frequency.

    In addition, computer animation looks jerky even at higher frame rates because every image is perfectly sharp, whereas in real filmed action each image is blurred by motion. Even at 120 fps, computer animations of moving objects look wrong because of this. Motion-picture effects deliberately motion-blur moving objects, but this is beyond the processing power of most games.
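    One way this can be approximated (a minimal sketch, assuming the renderer can produce the scene at arbitrary in-between instants; `render_at` is a hypothetical function, not a real game-engine API) is temporal supersampling: render several instants within one frame's exposure window and average them.

    ```python
    # Minimal motion-blur sketch: average several renders taken at instants
    # spread across one frame's exposure time.
    def motion_blurred_frame(render_at, frame_start, exposure, samples=8):
        accum = None
        for i in range(samples):
            t = frame_start + exposure * (i + 0.5) / samples
            img = render_at(t)  # hypothetical: image as a flat list of values
            if accum is None:
                accum = [0.0] * len(img)
            accum = [a + p for a, p in zip(accum, img)]
        return [a / samples for a in accum]

    # Example: a single bright pixel sweeping across a 10-pixel row
    def render_at(t):
        row = [0.0] * 10
        row[int(t * 10) % 10] = 1.0
        return row

    print(motion_blurred_frame(render_at, frame_start=0.0, exposure=0.5))
    ```

    Doing this in real time multiplies the rendering cost by the number of sub-frame samples, which is why games have historically skipped it.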
     
  6. Jan 14, 2008 #5

    Mech_Engineer

    Science Advisor
    Gold Member

    This isn't strictly true; in fact, most computer monitors purposely run faster than 60 Hz precisely to avoid the perceptible flicker effect. Most high-end CRT monitors can display at least 85 Hz at full resolution, and some will go higher than 100 Hz depending on the video adapter.

    LCD monitors (and, these days, televisions) are usually rated by a response time in milliseconds or a refresh rate in Hz. A very fast LCD monitor or TV will have a 5-6 ms response time (roughly 200-166 Hz), not because they are trying to avoid flicker but because they are trying to avoid motion blurring.
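    The Hz figures quoted there are just the reciprocal of the response time - a rough rule of thumb, since response time and refresh rate are measured differently:

    ```python
    # Rule-of-thumb conversion: how many full transitions per second a given
    # pixel response time could keep up with (1 / response time).
    def response_time_to_hz(response_ms):
        return 1000.0 / response_ms

    for ms in (5, 6, 16.7):
        print(f"{ms} ms -> {response_time_to_hz(ms):.0f} Hz")
    # 5 ms -> 200 Hz, 6 ms -> 167 Hz, 16.7 ms -> 60 Hz
    ```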

    On the contrary, a computer animation running at 120 fps should look perfectly smooth... I wouldn't say "jerkiness" is a trait of computer animations these days, but a big factor in people being able to tell that something looks fake is the lack of imperfections.

    Digitally enhanced TV shows and movies have been making a big push to make special effects look as real as possible, and a lot of this means adding fake defects that would be seen in the real thing, like depth of field and camera shake. The TV show Firefly was a very interesting example of this: almost all of its CGI shots purposely had something wrong with them, from an out-of-focus subject that is quickly corrected, to camera shake and misframed shots. Indeed, it seems it is the imperfections in a video that help make it seem real.
     
  7. Jan 14, 2008 #6

    mgb_phys

    Science Advisor
    Homework Helper

    Sorry, I meant to say a "minimum of 60 Hz" - i.e., a computer monitor running at 60 Hz will be horrible and flickery to use, even though it is faster than TV.
    Jerky is the wrong word - an animation at 120 fps with perfect images will look 'wrong' because of the lack of motion blur.
    Interesting point; I haven't seen Firefly, but a couple of books on CGI mention simulating camera optical aberrations and adding atmospheric fog to make images look more 'real', or at least cinematic.

    Droidworks (an excellent history of computer graphics and Pixar) mentioned that a problem with CGI for the Star Wars movies was that the spaceships looked too clean, but they didn't have the processing power to render dirt and stains on the surfaces.
     
    Last edited: Jan 14, 2008
  8. Jan 15, 2008 #7

    Mech_Engineer

    Science Advisor
    Gold Member

    Well, it really depends on how fast the subject in the scene is moving, not how many frames per second the scene is being rendered at. A scene of a stationary object rendered at 120 fps should look fine. Motion blur doesn't really have anything to do with the rendered frame rate; it has to do with the integration time of whatever is capturing the scene (your eye, a video camera). Simulated motion blur is one way of reproducing an imperfection that would be seen in a filmed sequence.
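    To put a number on the "integration time, not frame rate" point (illustrative figures only): the smear length depends on how far the subject moves during the exposure, so a stationary object blurs by zero pixels at any frame rate.

    ```python
    # Blur smear depends on subject speed and exposure (integration) time,
    # not on how many frames per second are rendered.
    def blur_length_px(speed_px_per_s, exposure_s):
        return speed_px_per_s * exposure_s

    # 1/48 s is roughly a film camera's exposure at 24 fps with a
    # 180-degree shutter (an assumed example exposure).
    print(blur_length_px(0, 1 / 48))     # 0.0 - stationary object, no blur
    print(blur_length_px(2000, 1 / 48))  # ~41.7 px for a fast-moving object
    ```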

    You disgust me :P I highly recommend it, along with the movie Serenity, which basically finishes off what begins in the Firefly TV show - the show was cancelled after its first season (why do networks always cancel the TV shows I like?!).


    In the Firefly DVD set, the behind-the-scenes commentary talks about how they used technically "bad" cinematography, such as mid-shot focus corrections and camera shake, to achieve the show's desired effect. They were careful to include the same problems in both the filmed live action and the CGI shots to maintain continuity and a "real" feel.

    Except that spaceships wouldn't necessarily have dirt on them :cool: I did notice that the re-released "special edition" versions of Episodes 4, 5, and 6 had somewhat more real-looking ships, in that they had burn marks and such on them.
     
    Last edited: Jan 15, 2008
  9. Jan 15, 2008 #8
    60 fps may mean fields per second. A field is half of the scanned lines on the screen, so that might be it. But yeah, as others said: TV is 30 f(rames)ps, and the big-screen movie theater is 24.
     