Why does a light blinking quickly average out as 'on' to us?

In summary, past a certain blink rate a light will appear continuously 'on' to humans because our eyes cannot register the off state. That point is around 1/15 of a second, or roughly 15 flashes per second. IBM 3270 type CRT monitors had a very high persistence factor, making the text appear sharp even at a slow refresh rate. Flicker in a film shown at around 18 frames per second or faster is less noticeable because the dark interval between frames is very short. Our receptors have a quick response time so that we can perceive motion without blurring.
  • #1
Steve Drake
Consider a light, an LED for example, turning on and off once per second. For humans, we will look at it and think "clearly on, clearly off, clearly on, clearly off" for each 'state'.

Our view of whether it is on or off will continue like this as its 'blink frequency' is increased, up to a certain point. Once it is blinking at a certain speed, we won't be able to tell that it is turning off at all, and it will seem 'on' to us 100% of the time.

But why does it average out as 'on' to us and not 'off'?
 
  • #2
Let's say that the light is on half the time and off the other half. If we increase the frequency past the point where our eyes can no longer tell the light is blinking, then the light will look half as bright as when it is always on. Since a light that is half as bright as normal is still "on" to us, the light looks on.

Pretty much it boils down to "off" being the state when the light isn't emitting any light. So as long as at least some light is being emitted, the light will look on.
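
A minimal numeric sketch of that averaging (Python; the peak brightness of 1.0 is an illustrative assumption, not a measurement):

```python
# Time-averaged brightness of a square-wave (blinking) light.
# Assumption: above the fusion threshold, perception tracks the average.

def average_brightness(peak, on_time, off_time):
    """Average emitted brightness over one on/off cycle."""
    return peak * on_time / (on_time + off_time)

# On half the time, off the other half -> half the peak brightness:
print(average_brightness(peak=1.0, on_time=0.5, off_time=0.5))  # 0.5
# Half brightness is still some light, so the light reads as "on";
# only a 0.0 average would read as "off".
```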
 
  • #3
Steve Drake said:
Consider a light, an LED for example, turning on and off once per second. For humans, we will look at it and think "clearly on, clearly off, clearly on, clearly off" for each 'state'.

Our view of whether it is on or off will continue like this as its 'blink frequency' is increased, up to a certain point. Once it is blinking at a certain speed, we won't be able to tell that it is turning off at all, and it will seem 'on' to us 100% of the time.

But why does it average out as 'on' to us and not 'off'?

That certain point is around 1/15 of a second, or about 15 flashes per second. When the optic nerves are activated, it takes about that much time, or a bit more, for the "impulse" to fade out, so the brightness persists as long as the dark intervals between flashes are shorter than that fade-out interval.
 
  • #4
Steve Drake said:
Consider a light, an LED for example, turning on and off once per second. For humans, we will look at it and think "clearly on, clearly off, clearly on, clearly off" for each 'state'.

Our view of whether it is on or off will continue like this as its 'blink frequency' is increased, up to a certain point. Once it is blinking at a certain speed, we won't be able to tell that it is turning off at all, and it will seem 'on' to us 100% of the time.

But why does it average out as 'on' to us and not 'off'?

It doesn't average to fully ON.

It averages to half brightness (or a brightness that depends on the ratio of ON to OFF time). For example, if it were ON for 25 ms then OFF for 75 ms, it would appear roughly a quarter as bright as it would if ON all the time.

This is the way many light dimmers work: they switch the light fully on and off rapidly and vary the ON:OFF ratio (pulse-width modulation).
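
A small sketch of that ratio in code (Python; the 25/75 on/off times come from the example above, the other rows are illustrative):

```python
# Pulse-width modulation (PWM) dimming: the light is only ever fully
# on or fully off, and the ON:OFF ratio (duty cycle) sets the
# time-averaged, and hence perceived, brightness.

def duty_cycle(on_ms, off_ms):
    return on_ms / (on_ms + off_ms)

for on_ms, off_ms in [(50, 50), (25, 75), (10, 90)]:
    print(f"ON {on_ms} ms / OFF {off_ms} ms -> "
          f"{duty_cycle(on_ms, off_ms):.0%} of full brightness")
# ON 25 ms / OFF 75 ms -> 25%: roughly a quarter as bright, as above.
```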
 
  • #5
carrz said:
That certain point is around 1/15 of a second, or about 15 flashes per second. When the optic nerves are activated, it takes about that much time, or a bit more, for the "impulse" to fade out, so the brightness persists as long as the dark intervals between flashes are shorter than that fade-out interval.
Many people, including me, can see 60 Hz flicker on a CRT monitor because the phosphors are designed for higher frequencies and fade too much at 60 Hz. Some people may not notice the flicker up close, but will notice it at some distance, say 3 meters or so, from the monitor. I still have a pair of CRT monitors and run both at the recommended 85 Hz refresh rate.

IBM 3270 type CRT monitors (mostly green text on a black background) have (had) a very low fade rate (a very high persistence factor) and could run at slower refresh rates. These terminals were mostly used with "protected" fixed text fields on a display, where the user keys data into the "unprotected" blank fields on the screen. The persistence was about a second, so a moving cursor left a fairly long "trail", and anything that resembled a scroll-like operation took almost a second to clear up. However, the normal usage was a fixed-field mode, and with the high persistence a very thin beam could be used, so the text was very sharp.

Flicker in a film shown at around 18 frames per second or faster isn't as noticeable, because the period of darkness between frames is very short (the time to close the shutter, advance the frame, and open the shutter is very fast compared to the frame period).
 
  • #6
rcgldr said:
Flicker in a film shown at around 18 frames per second or faster isn't as noticeable, because the period of darkness between frames is very short (the time to close the shutter, advance the frame, and open the shutter is very fast compared to the frame period).
On many, if not most, movie projectors, the light is interrupted twice for each film frame that goes through. This doubles the flicker rate, which makes the flicker much less visible. It doesn't help with the jerkiness of the motion portrayal, though. Ah, the joy of watching good frame-rate up-conversion (over a hundred hertz) since they started giving us digital TV.
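
The arithmetic behind that trick, as a quick sketch (Python; 24 fps is the standard sound-film rate, and two- and three-blade shutters are the common designs):

```python
# A projector's rotating shutter can interrupt the light several times
# per film frame, raising the flicker rate above the eye's fusion
# threshold without needing any extra film frames.

frame_rate = 24  # frames per second (standard sound film)

for blades in (1, 2, 3):  # light interruptions per frame
    print(f"{blades} interruption(s)/frame -> {frame_rate * blades} Hz flicker")
# 24 Hz flickers visibly; 48 Hz and 72 Hz look far steadier.
# Motion is still sampled at only 24 Hz, hence the remaining jerkiness.
```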
 
  • #7
rcgldr said:
Many people, including me, can see 60 Hz flicker on a CRT monitor because the phosphors are designed for higher frequencies and fade too much at 60 Hz. Some people may not notice the flicker up close, but will notice it at some distance, say 3 meters or so, from the monitor. I still have a pair of CRT monitors and run both at the recommended 85 Hz refresh rate.

I guess the reason for that is either a quicker fade-out interval of the "eye/brain pixels", or a higher sensitivity that distinguishes more shades of brightness, or a little bit of both.
 
  • #8
One good reason for the receptors having such a quick response time is that we actually need to perceive motion. A very 'laggy' response would lead to a blurring of movement (smearing) like the old CCTV pictures, which were produced from the cheap old Vidicon camera tubes. That 'annoying' flicker that we used to see with an old CRT TV in our peripheral vision means that we are aware of small rapid movements on the edge of the scene. That could have been a leopard, ready to spring on us. We really weren't evolved for TV watching!
But there is, as always, an engineering compromise here. Having a longer response time would mean that we could be more sensitive in low light (and, incidentally we'd enjoy TV more).
 
  • #9
Steve Drake said:
Consider a light, an LED for example, turning on and off once per second. For humans, we will look at it and think "clearly on, clearly off, clearly on, clearly off" for each 'state'.

Our view of whether it is on or off will continue like this as its 'blink frequency' is increased, up to a certain point. Once it is blinking at a certain speed, we won't be able to tell that it is turning off at all, and it will seem 'on' to us 100% of the time.

But why does it average out as 'on' to us and not 'off'?

It's an interesting question: visual thresholds depend on the source brightness, contrast, size, color, and also duration. A small, low-contrast object that is 'on' for a short period of time may not be seen. A nice overview of dynamic thresholds is here:

http://webvision.med.utah.edu/book/part-viii-gabac-receptors/temporal-resolution/

And here:
http://www.ski.org/CWTyler_lab/CWTyler/TylerPDFs/HamerTylerIV-JOSA1990.pdf
 
  • #10
But why does it average out as 'on' to us and not 'off'?

If the average level is low then it will average out to 'off', even if the peak brightness is quite high. As it happens, such a low (marginal) level would be more visible on the periphery of vision.
 
  • #11
Is it simpler to understand if you just consider the number of photons hitting your eyeball? More photons = more light = brighter. Fewer = dimmer. Or I could be oversimplifying.
 
  • #12
dkotschessaa said:
Is it simpler to understand if you just consider the number of photons hitting your eyeball? More photons = more light = brighter. Fewer = dimmer. Or I could be oversimplifying.

That's true, but it does not address the time interval at which, instead of blinking, we start to see continuous light. Optic nerves have a "fade-out" interval: the more photons hit, the longer it takes for the perceived brightness to fade out. For an average TV brightness this fade-out interval is around 1/15 of a second, which is why 24 frames per second is generally sufficient to perceive "smooth animation".

But if you take a quick look at the Sun, the brightness perception would persist for likely more than a second, even with your eyes closed. In other words, if the Sun were blinking on and off at one-second intervals we might not be able to notice it with the naked eye, except that it would appear half as bright, roughly speaking. It's more complex than that, because besides the brightness impulse in the optic nerves there is also the brain interpreting it all, and the brain has its own operating frequency and possibly other time-related limits influencing actual perception and the final "movie" experienced by the mind.
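
One way to make that fade-out idea concrete is a toy "leaky integrator" model of perception (Python; the 1/15 s time constant comes from the posts above, while the square-wave stimulus, 50% duty cycle, and test frequencies are assumptions of the sketch, not taken from the vision literature cited earlier):

```python
# Toy model: perceived brightness rises while light arrives and decays
# exponentially (time constant tau) when it stops. If the dark gaps are
# short compared to tau, the response never drops near zero, so the
# light reads as a steady "on" at reduced brightness.

def response_range(flash_hz, tau=1 / 15, seconds=1.0, dt=1e-4):
    """Min/max model response to a 50%-duty square-wave light."""
    p, lo, hi = 0.0, float("inf"), 0.0
    for i in range(int(seconds / dt)):
        t = i * dt
        light = 1.0 if (t * flash_hz) % 1.0 < 0.5 else 0.0
        p += (light - p) * dt / tau      # first-order low-pass step
        if t > 0.5:                      # ignore the initial transient
            lo, hi = min(lo, p), max(hi, p)
    return lo, hi

for hz in (1, 5, 30, 60):
    lo, hi = response_range(hz)
    print(f"{hz:3d} Hz: response swings between {lo:.2f} and {hi:.2f}")
# 1 Hz swings roughly 0.00..1.00 (clearly blinking); 60 Hz stays near
# 0.5 (steady, half brightness, and therefore "on").
```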
 

1. Why does a light blinking quickly appear to be constantly on to us?

The phenomenon of a quickly blinking light appearing to be constantly on is due to the persistence of vision. This is the tendency of the human eye to retain an image for a fraction of a second after it has disappeared from view. When a light blinks quickly, the retina is still processing the light even when it is off, creating the illusion of a constant light.

2. How does our brain interpret a quickly blinking light as being "on"?

Our brain interprets a quickly blinking light as being "on" because it receives a continuous stream of information from the retina, even during the periods when the light is off. This information is then processed and interpreted as a constant light, due to the persistence of vision.

3. Does the frequency of the blinking light affect how we perceive it?

Yes, the frequency of the blinking light does affect how we perceive it. A light blinking at a higher frequency will appear to be more constant to us, while a light blinking at a lower frequency may appear to be flickering or even off. This is because the higher frequency allows for a shorter gap between blinks, creating a smoother and more consistent image for our brain to interpret.

4. Can everyone perceive a quickly blinking light as "on"?

No, not everyone perceives a quickly blinking light as "on" at the same frequency. The threshold is influenced by individual factors such as age, visual acuity, and retinal sensitivity, so some people may still see a blinking light as flickering at a frequency that looks steady to others.

5. What other factors can affect the perception of a quickly blinking light as "on"?

Aside from individual factors, environmental factors such as ambient light and distance from the light source can also affect the perception of a quickly blinking light as "on." In a well-lit room, the persistence of vision may not be as noticeable, while in a dark room the illusion may be more prominent. Additionally, the distance from the light source can affect the intensity of the light received by the retina, which can also impact the perception of the blinking light.
