What is the recommended frame rate for optimal viewing?

  • Thread starter: rohanprabhu
  • Tags: Fps
AI Thread Summary
Television standards typically run at 50/60 fields per second, and action games are often said to be best played at 70-80 fps. Although the human eye retains an image for a brief moment, flicker can still be detected at frame rates as high as 30 fps. The resolution limit of the human eye means pixels only need to be small enough that individual pixels cannot be distinguished; estimates in the thread put this at roughly 530-600 pixels per inch for a print viewed at 20 inches. High frame rates in computer animation can still look unnatural because of the absence of motion blur, which is present in filmed live action. Overall, balancing frame rate and visual fidelity is what matters for an immersive viewing experience.
rohanprabhu
Some television standards show images at 50/60 fps. Action games are meant to be played at 70-80 fps. But if our eyes retain an image for only about 1/6th of a second, why is it that we can detect flicker even at 24 fps in some cases [sometimes flicker is visible even at 30 fps or so]?

Also, what is the resolving power of our eye? This question relates to the maximum dpi we need to achieve. There has to be some limit for the eye, for example a measurement in length: if two sufficiently small particles are placed side by side, the eye cannot tell them apart. Likewise, if the limit is 'x' dpi and I place two pixels at a resolution of more than 'x' dpi, the eye cannot tell the difference between them.
 
You might find an answer here: http://www.daniele.ch/school/30vs60/30vs60_1.html
 
rohanprabhu said:
Some television standards show images at 50/60 fps. Action games are meant to be played at 70-80 fps. But if our eyes retain an image for only about 1/6th of a second, why is it that we can detect flicker even at 24 fps in some cases [sometimes flicker is visible even at 30 fps or so]?

As far as I know, television shows at about 30 fps; I know that DVDs are 29.97 fps. 60 frames per second is about the maximum frame rate our eyes can process; anything above that tends to be imperceptible to us.

As for action video games, you will get the maximum visual benefit at about 60 frames per second; anything above that is just for bragging rights about the power of your computer.

rohanprabhu said:
Also, what is the resolving power of our eye? This question relates to the maximum dpi we need to achieve. There has to be some limit for the eye, for example a measurement in length: if two sufficiently small particles are placed side by side, the eye cannot tell them apart. Likewise, if the limit is 'x' dpi and I place two pixels at a resolution of more than 'x' dpi, the eye cannot tell the difference between them.

I'm not sure there is a hard-and-fast number for this, but this website has some interesting conclusions:

http://www.clarkvision.com/imagedetail/eye-resolution.html

Clarkvision Photography said:
Visual Acuity and Resolving Detail on Prints

How many pixels are needed to match the resolution of the human eye? Each pixel must appear no larger than 0.3 arc-minute. Consider a 20 x 13.3-inch print viewed at 20 inches. The print subtends an angle of 53 x 35.3 degrees, thus requiring [10600 x 7000] pixels, for a total of ~74 megapixels to show detail at the limits of human visual acuity.

The 10600 pixels over 20 inches corresponds to 530 pixels per inch, which would indeed appear very sharp. Note that in a recent printer test I showed a 600 ppi print had more detail than a 300 ppi print on an HP1220C printer (1200x2400 print dots). I've conducted some blind tests where a viewer had to sort 4 photos (150, 300, 600 and 600 ppi prints). The two 600 ppi prints were printed at 1200x1200 and 1200x2400 dpi. So far all have gotten the correct order of highest to lowest ppi (includes people up to age 50).
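For what it's worth, the arithmetic in that quote is easy to reproduce. Here is a minimal sketch (Python, taking the article's 0.3 arc-minute and 53 x 35.3 degree figures as given); it just converts degrees to arc-minutes and divides by the per-pixel limit:

```python
import math

# Figures taken from the quoted Clarkvision passage (treated here as given):
acuity_arcmin = 0.3          # one pixel should subtend no more than 0.3 arc-minute
w_deg, h_deg = 53.0, 35.3    # angles subtended by a 20 x 13.3 in print viewed at 20 in
print_width_in = 20.0

# Sanity check on the width angle from basic geometry:
# full angle = 2 * atan(half-width / distance) = 2 * atan(10 / 20) ~= 53.1 degrees
assert abs(2 * math.degrees(math.atan(10 / 20)) - w_deg) < 0.2

# Pixels needed so that no pixel exceeds the acuity limit (60 arc-minutes per degree)
px_w = w_deg * 60 / acuity_arcmin     # ~10600 px
px_h = h_deg * 60 / acuity_arcmin     # ~7060 px
megapixels = px_w * px_h / 1e6        # ~75 MP (the quote rounds px_h to 7000 -> ~74 MP)
ppi = px_w / print_width_in           # ~530 pixels per inch

print(f"{px_w:.0f} x {px_h:.0f} px, ~{megapixels:.0f} MP, ~{ppi:.0f} ppi")
```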
 
It's difficult to assign hard numbers because there is a lot of processing by your brain.
Films run at 24 fps but project each frame twice, so they really show 48 images per second; this was a compromise worked out in the early days of film between using too much film stock and appearing jerky.
TV shows fields - half frames consisting of every odd or every even row alternately - so it shows 50 (Europe) or 60 (US) pictures per second but only 25/30 full frames. In the same way as film, this reduces the jerky effect while saving bandwidth.
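To make the field/frame distinction concrete, here is a small illustrative sketch (Python/NumPy, with a toy array size rather than any real broadcast format): each field carries only every other scan line, so two fields are needed to build one full frame.

```python
import numpy as np

# Minimal sketch of interlacing: a 6-line "frame" is transmitted as two fields,
# one carrying the even-numbered lines and one carrying the odd-numbered lines.
# Sizes here are illustrative, not any broadcast spec.
frame = np.arange(6 * 8).reshape(6, 8)    # toy 6x8 image, each pixel = its own index

even_field = frame[0::2, :]   # lines 0, 2, 4  (sent in one 1/50 s or 1/60 s interval)
odd_field  = frame[1::2, :]   # lines 1, 3, 5  (sent in the next interval)

# A simple "weave" deinterlacer puts the two fields back together:
rebuilt = np.empty_like(frame)
rebuilt[0::2, :] = even_field
rebuilt[1::2, :] = odd_field
assert np.array_equal(rebuilt, frame)

# 50 or 60 fields per second therefore carries only 25 or 30 full frames per second.
fields_per_sec = 50
print(fields_per_sec, "fields/s ->", fields_per_sec // 2, "full frames/s")
```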

Computer monitors show 60 full frames per second, so they should be better than either TV or film - except for a couple of physiological effects.
You have much better flicker perception out of the corner of your eye (you are descended from a long line of ancestors who saw something moving out of the corner of their eye in time to run away), and you can see flicker on bright objects more easily. So, sitting up close to a bright, high-contrast computer screen, you see flicker more easily; this is made worse if you are also under electric lights that flicker at the same frequency.

In addition, computer animation looks jerky even at higher frame rates because the images are all perfectly still, whereas in real filmed action each image is blurred by motion. Even at 120 fps, computer animations of moving objects look wrong because of this. Motion-picture effects deliberately motion-blur moving objects, but this is beyond the processing power of most games.
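For what it's worth, one common way renderers fake that blur when they can afford it is temporal supersampling: average several sub-frame renders into each displayed frame. Here is a toy sketch (Python/NumPy, purely illustrative, with made-up sizes and speeds) of a bright dot moving across a 1-D strip of pixels:

```python
import numpy as np

WIDTH = 32           # 1-D "screen" width in pixels (toy example)
SPEED = 8.0          # object speed in pixels per displayed frame
SUBSAMPLES = 8       # sub-frame renders averaged into each displayed frame

def render(position):
    """Render a single sharp image: a bright dot at the given (fractional) position."""
    img = np.zeros(WIDTH)
    img[int(position) % WIDTH] = 1.0
    return img

def frame_with_motion_blur(frame_index):
    """Average several renders spread across the frame's time interval."""
    t0 = frame_index * SPEED
    samples = [render(t0 + SPEED * k / SUBSAMPLES) for k in range(SUBSAMPLES)]
    return np.mean(samples, axis=0)

sharp   = render(0 * SPEED)             # what a game normally shows: one instant in time
blurred = frame_with_motion_blur(0)     # closer to what a film camera's exposure records

print("sharp  :", np.nonzero(sharp)[0])     # a single lit pixel
print("blurred:", np.nonzero(blurred)[0])   # a streak of dimmer pixels along the path
```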
 
mgb_phys said:
Computer monitors show 60 full frames per second, so they should be better than either TV or film - except for a couple of physiological effects
...So, sitting up close to a bright, high-contrast computer screen, you see flicker more easily; this is made worse if you are also under electric lights that flicker at the same frequency.

This isn't strictly true; in fact, most computer monitors are deliberately run faster than 60 Hz precisely to avoid the perceptible flicker effect. Most high-end CRT monitors can display at least 85 Hz at full resolution, and some will go higher than 100 Hz depending on the video adapter.

LCD monitors (and televisions these days) are usually rated by a response time in milliseconds or a refresh rate in Hz. A very fast LCD monitor or TV will have a 5-6 ms response time (roughly 166-200 Hz), not because the makers are trying to avoid flicker but because they are trying to avoid motion blurring.
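The Hz figures in parentheses are just the reciprocal of the response time - only a rough equivalence, since real LCD response times are specified in various gray-to-gray ways, but the conversion itself is trivial:

```python
# Rough conversion used above: a response time of t milliseconds allows at most
# 1000 / t full transitions per second (an idealized figure, not a spec).
for t_ms in (5, 6, 16.7):
    print(f"{t_ms} ms  ->  {1000 / t_ms:.0f} Hz")
```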

mgb_phys said:
In addition, computer animation looks jerky even at higher frame rates because the images are all perfectly still, whereas in real filmed action each image is blurred by motion. Even at 120 fps, computer animations of moving objects look wrong because of this. Motion-picture effects deliberately motion-blur moving objects, but this is beyond the processing power of most games.

On the contrary, a computer animation running at 120 fps should look perfectly smooth... I wouldn't say "jerkiness" is a trait of computer animations these days, but a big factor in people being able to tell that something is fake is the lack of imperfections.

Digitally enhanced TV shows and movies have been making a big push to make special effects look as real as possible, and a lot of this means adding in fake defects that would be seen in the real thing, like depth of field and camera shake. The TV show Firefly was a very interesting example of this: almost all of its CGI shots deliberately had something wrong with them, from an out-of-focus subject that is quickly corrected to camera shake and misframed shots. Indeed, it seems it is the imperfections in a video that help make it seem real.
 
Sorry, I meant to say a "minimum of 60 Hz" - i.e., a computer monitor running at 60 Hz will be horrible and flickery to use, even though it is faster than TV.
Jerky is the wrong word - an animation at 120 fps with perfect images will look 'wrong' because of the lack of motion blur.
Interesting point, haven't seen Firefly, but a couple of books on CGI mentioned simulating camera optical aberrations and adding atmospheric fog to make images look more 'real', or at least cinematic.

Droidworks (an excellent history of computer graphics and Pixar) mentioned that a problem with CGI for the Star Wars movies was that the spaceships looked too clean, but they didn't have the processing power to render dirt and stains on the surfaces.
 
mgb_phys said:
Jerky is the wrong word - an animation at 120 fps with perfect images will look 'wrong' because of the lack of motion blur.

Well, it really depends on how fast the subject in the scene is moving, not how many frames per second the scene is being rendered at. A scene of a stationary object being rendered at 120 fps should look fine. Motion blur doesn't really have anything to do with the rendered frame rate; it has to do with the integration time of whatever is capturing it (your eye, a video camera). Simulated motion blur is one way of adding an imperfection that would be seen in a filmed sequence.

mgb_phys said:
Interesting point, haven't seen Firefly...

You disgust me :P I highly recommend it, along with the movie Serenity, which basically finishes off what begins in the Firefly TV show, which was canceled after its first season (why do stations always cancel the TV shows I like?!).


mgb_phys said:
... but a couple of books on CGI mentioned simulating camera optical aberrations and adding atmospheric fog to make images look more 'real', or at least cinematic.

In the Firefly DVD set, the behind-the-scenes commentary talks about how they used technically "bad" cinematography to achieve the show's desired effect, such as visibly correcting focus and adding camera shake. They were careful to include the same problems in both the filmed live action and the CGI shots to maintain continuity and a "real" feel.

mgb_phys said:
Droidworks (an excellent history of computer graphics and Pixar) mentioned that a problem with CGI for the Star Wars movies was that the spaceships looked too clean, but they didn't have the processing power to render dirt and stains on the surfaces.

Except that spaceships wouldn't necessarily have dirt on them :cool: I did notice that the re-released "special edition" versions of Episodes 4, 5, and 6 had somewhat more real-looking ships, in that they had burn marks and such on them.
 
60 fps may mean fields per second. A field is half of the scanned lines on a screen, so that might be it. But yeah, as others said, TV is 30 f(rames)ps and big-screen movie theaters run at 24.
 