# Motion to our eye?

## Main Question or Discussion Point

I have read somewhere that motion on TV is really a series of still pictures presented in rapid succession, but I forgot the minimum number of still pictures per second needed for us to perceive motion. What is that number?

I also realise that the power supplied to our homes is AC, at 50 Hz where I live, meaning that the current reverses direction 100 times a second. When such a current powers a lamp in my home, I don't detect the light flickering, which I would if the AC frequency were much lower. I wonder what the minimum frequency is at which flicker can just no longer be seen and continuous illumination is perceived.

Would the two frequency numbers in the two questions equal each other?
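The doubling in the second question can be sketched numerically (a hypothetical check, not from the thread): for a resistive lamp the light output follows the instantaneous power, which goes as sin²(2πft), and since sin²(x) = (1 − cos 2x)/2, the brightness peaks at twice the mains frequency.

```python
import numpy as np

# Assumed model: lamp brightness tracks instantaneous power ~ sin^2(2*pi*f*t)
f = 50.0  # mains frequency, Hz
t = np.linspace(0.0, 1.0, 100_000, endpoint=False)  # one second of samples
power = np.sin(2 * np.pi * f * t) ** 2  # instantaneous power (arbitrary units)

# Count the local maxima of the power waveform over that one second
mid = power[1:-1]
peaks = int(np.sum((mid > power[:-2]) & (mid > power[2:])))
print(peaks)  # 100 brightness peaks per second, i.e. twice the 50 Hz supply
```

So a 50 Hz supply drives an incandescent lamp's brightness ripple at 100 Hz, well above the 15-20 Hz flicker threshold discussed below.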

russ_watters
Mentor
It depends on a lot of factors. Movies, for example, take advantage of the fact that a bright image essentially burns itself into your eyes (it's called persistence of vision), and have a frame rate of only 24 fps. The same effect applies to lights. But many people get headaches from computer monitors at 60 Hz.

NoTime
Homework Helper
IIRC film only has a frame rate of 22 frames per second.
Part of the secret here, in terms of Hz, is that each frame is displayed in its entirety for almost the entire period; only a very small fraction of the cycle is used to swap frames.

chemisttree
Homework Helper
Gold Member
Anything over about 15 to 20 images per second will appear as smooth motion or continuous illumination to most people. In the case of illumination we are strictly talking about fluorescent and "neon" lighting. Incandescent lighting is effectively continuous at 60 Hz because the filament doesn't cool appreciably between cycles. Some people can discern a flicker from fluorescent lights out of the corner of their eye at this rate.

The television scans (rasters) the screen at 60 Hz, skipping every other line on each pass. When the scan reaches the bottom of the screen, the electron gun returns to the top, drops down one line, and scans the lines it skipped. The completed image is displayed after the second interlaced scan, which yields 30 Hz per completed image (525 lines in two interlaced fields).

The human eye's flicker threshold is around 15 to 20 Hz near the center of vision; it is higher near the edge of our vision (out of the corner of our eye). So TV signals don't appear to flicker because they are rastered faster than our eyes can discern. You can sometimes see flicker from a video screen out of the corner of your eye, or when objects on the screen move very quickly. Sitting further away from the screen, and thus reducing its effective projected size on the retina, usually helps with annoying flicker.

Try it yourself: set your screen to interlaced mode at 60 Hz and sit very close to it, concentrating on the edges of your vision. Most people can just discern a slight flicker near the edge of the screen. It really bugs some people; reset your screen to a higher refresh rate if it does.

As far as I know, television frames are not presented in succession or swapped as they are in film; swapping would imply that the screen is cleared between images. With television, the electron gun just starts displaying the next "frame", which is interlaced with the previous image. If the image is moving, this interlacing between images results in a blurry image being displayed every other frame at 30 Hz. The second pass overwrites the image, resulting in a new image displayed at 30 Hz.
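The interlace pattern described above can be sketched in a few lines (NTSC-style numbers assumed): 525 lines are scanned as two alternating fields at 60 fields/s, so a complete frame arrives at 30 Hz.

```python
# Sketch of interlaced scanning: each field covers every other line,
# and two fields together make one complete 525-line frame.
LINES = 525
FIELD_RATE_HZ = 60.0

field_a = list(range(0, LINES, 2))  # first pass: even-numbered lines (263 lines)
field_b = list(range(1, LINES, 2))  # second pass: the skipped lines (262 lines)
frame_rate = FIELD_RATE_HZ / 2      # two fields complete one frame

print(len(field_a), len(field_b), frame_rate)  # 263 262 30.0
```

Together the two fields cover every line exactly once, which is why the field rate (60 Hz) governs perceived flicker while full images only update at 30 Hz.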

russ_watters
Mentor
> IIRC film only has a frame rate of 22 frames per second.
There are a number of different standards: http://en.wikipedia.org/wiki/Frame_rate

I wish I had kept it, but way back when 3D gaming started to get big, 3dfx put out a little demo of a bouncing ball showing that you could easily distinguish 30 fps from 60 in a computer rendering. But I wonder how much cinematic effects (anti-aliasing and motion blur) change that; they didn't exist when that demo was put out.

As that wiki article says, a lot of the discrepancies come from persistence of vision vs. data gaps. For example, an object that crosses your entire field of view in 1/20th of a second will appear in only one frame of a movie, and there will be no motion associated with it. In reality, if your eyes tried to follow such an event, you'd just see a blurry streak.
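That example can be checked with quick arithmetic (24 fps assumed here as a typical film rate, not stated in the post):

```python
# How many film frames capture an object that crosses the whole
# field of view in a given time?
fps = 24                 # assumed film frame rate
crossing_time = 1 / 20   # seconds to cross the field of view
frames_captured = fps * crossing_time
print(frames_captured)   # ~1.2, i.e. roughly a single frame
```

With only about one frame containing the object, the recording carries no motion information for it, which is the "data gap" being described.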

If our eyes register motion from 15 to 20 images per second, then we won't notice the flickering of a light on a 50 Hz AC supply, which would be equivalent to 100 images per second since the current changes direction twice per period. However, I am short-sighted, and when I take off my glasses things look very big; street lights seem the size of mini fireworks. Once I looked out the door and saw a light brightening and darkening. Then I put my glasses on and the flashes stopped; it was a steady stream of light from a street lamp. The current it receives comes from an AC supply. I wonder why I was able to detect the flashes without my glasses.

One curious feature I noticed: when looking at a television that is filming a computer screen, I can see the flickering of the computer screen, sometimes very clearly, but when I look at the computer screen directly, no flicker is apparent. Maybe it's because the recorder records at a rate close to the screen's refresh rate, and we see the difference between the two rates on the monitor, which is low and slow enough for our eyes to detect as flicker.
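That "difference between the rates" idea is just a beat (aliasing) frequency, and can be sketched with assumed numbers (the actual rates of the screen and camera in the post are unknown):

```python
# Assumed example: a camera sampling a flickering screen shows an
# apparent flicker at the difference of the two rates, not at either rate.
screen_refresh_hz = 60.0  # assumed monitor refresh rate
camera_rate_hz = 50.0     # assumed camera frame rate (e.g. a PAL-region camera)
beat_hz = abs(screen_refresh_hz - camera_rate_hz)
print(beat_hz)  # 10.0 Hz, far below the screen's own refresh and easy to see
```

A 10 Hz beat sits below the 15-20 Hz flicker threshold mentioned earlier, so the flicker that is invisible on the screen itself becomes visible in the recording.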

chemisttree
Homework Helper
Gold Member
Brightening and darkening how fast? I assume that the light you saw without your glasses was out of focus? A defocused image might be dimmer and allow you to see a slowly fluctuating source more distinctly than a very bright, tightly focused image. Try looking at the flickering light through dark prescription glasses and see if you still see the fluctuation.

Either that, or somehow your glasses are controlling the flickering light. You could be a superhero... (haven't figured out how to use emoticons yet)

Or it could be that I am seeing bits of the light magnified very well but also out of focus. As I said, the lights seem like mini fireworks. I can even see the flashing on my computer's light, although very slowly. But when I put my glasses on, the flashes go away. Are you suggesting it's the brightness that is the main reason? Maybe it's a combination of dimmer light and magnification.

chemisttree
Homework Helper
Gold Member
I'm suggesting that you may not be able to discern the flicker of a very bright source as well as you can discern the flicker of a dimmer or spread-out (out of focus) image.

Don't forget, you could also be a superhero and the glasses are interfering with your power....

In fact, I tried putting on a pair of sunglasses and the very same lights didn't seem to flicker anymore, so intensity plays a part in what I am seeing. But it still seems weird that I see the lights flickering without my glasses. Could it be due to the AC voltage they are receiving? And I don't think I am a superhero; I don't see the power in seeing lights flicker when other people don't.

The most bizarre phenomenon I see is that miniature neon lights (such as appliance pilot lights, power bar illuminated on/off switches, etc.) sometimes seem to flicker on and off randomly when viewed in the dark. Turn the lights on, or view them in daylight, and they change to a steady glow, (except for the 60 Hz flicker which you see if you rapidly turn your eyes sideways).

One of the lights I mentioned which seemed to be flashing was part of a laptop. However, I recently learned that circuits in laptops run on DC, so the flickering can't be due to an AC power supply. It must be due to the way the light itself works, i.e., how a light shines: it shines because electrons in the atoms drop from a higher energy level to a lower one, so it wouldn't be too surprising if the light were perceived to be changing (flickering) somewhat.

> One of the lights I mentioned which seemed to be flashing was part of a laptop. However, I recently learned that circuits in laptops run on DC, so the flickering can't be due to an AC power supply.
Well, maybe yes, maybe no.
Many DC circuits with LEDs use a train of DC pulses of varying width (i.e. duty cycle), generated by a timer chip, to control the perceived brightness. My cheap digital clock radio has bright and low-level settings for the display, and by rapidly "panning" my eyes across it in the dark I can tell that the difference is the duty cycle, because it shows up (in the after-image) as a dashed line with well-defined starts and stops. The frequency of the dashes is the same regardless of the brightness setting, and so is the peak brightness, but the length of the dashes changes.

I believe it's because it's easier (from a mass-production point of view) to use pulse widths to modulate the brightness than to select combinations of LED types (with their manufacturing variations) and current-limiting resistors (which also vary in spec) to bias them part-way on. With pulse-width modulation they're either fully on or fully off, which reduces it to a problem of timing rather than of current.
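The pulse-width idea above reduces to a simple average, which can be sketched as follows (a toy model, not any particular clock radio's circuit):

```python
# PWM brightness sketch: the LED is driven either fully on or fully off,
# and perceived brightness tracks the time-average of the waveform,
# which equals the duty cycle.
def pwm_average(duty_cycle, period_samples=1000):
    """Mean level of one PWM period with the given on-fraction."""
    on = int(duty_cycle * period_samples)
    waveform = [1.0] * on + [0.0] * (period_samples - on)
    return sum(waveform) / period_samples

print(pwm_average(0.25))  # dim display setting    -> 0.25
print(pwm_average(0.75))  # bright display setting -> 0.75
```

Note that the pulse frequency and the on-level never change between the two settings, matching the "same frequency, same peak brightness, different dash length" observation above.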

Oddly enough, I think the frequency in my clock radio's LED display is still 60 Hz; no doubt it's using the unrectified 110 V AC as one of its timing references.

Also, there are LEDs that give red light when the applied voltage is one polarity and green when it is the opposite. By rapidly alternating the polarity you can make one appear yellow (or intermediate shades of orange or yellow-green), because our eyes blend the red and green light.