Why does a light blinking quickly average out as 'on' to us?

  • Thread starter: Steve Drake
  • Tags: Average, Light
AI Thread Summary
A light blinking rapidly can appear continuously 'on' to human observers because of persistence of vision: the visual response takes time to fade after the light turns off. The effect sets in once the blink frequency rises above roughly 15 Hz, at which point the light averages out as 'on' even though it is dark half the time. The perceived brightness is roughly proportional to the ratio of on time to off time, so as long as some light is emitted the source still looks 'on', just dimmer. Brightness, contrast, and duration also influence visual thresholds. Ultimately, the way the eye and brain process light and motion is what makes a rapidly flickering light seem continuously illuminated.
Steve Drake
Consider a light, an LED for example, turning on and off once per second. As humans, we will look at it and think "clearly on, clearly off, clearly on, clearly off" for each 'state'.

Our view of whether it is on or off will continue like this as its 'blink frequency' is increased, up to a certain point. Once it is blinking fast enough, we won't be able to tell that it is turning off at all, and it will seem 'on' to us 100% of the time.

But why does it average out as 'on' to us and not 'off'?
 
Let's say that the light is on half the time and off the other half. If we increase the frequency past the point where our eyes can no longer tell the light is blinking, then the light will look half as bright as when it is on all the time. Since a light that is half as bright as normal is still "on" to us, the light looks on.

Pretty much it boils down to "off" being the state when the light isn't emitting any light. So as long as at least some light is being emitted, the light will look on.
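
To put a rough number on that: above the flicker-fusion rate the eye effectively time-averages the light, so (as a sketch that ignores the nonlinearity of brightness perception) the perceived level scales with the on-time fraction:

$$\bar{I} \;\approx\; \frac{t_\text{on}}{t_\text{on}+t_\text{off}}\, I_\text{peak}$$

With equal on and off times that's about half the peak: dimmer, but clearly not zero, so it still reads as 'on'.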
 
Steve Drake said:
But why does it average out as 'on' to us and not 'off'?

That certain point is around 1/15 of a second, i.e. about 15 flashes per second. When the photoreceptors and optic nerve are activated, it takes roughly that long, or a bit more, for the "impulse" to fade out, so the brightness persists as long as the dark intervals are shorter than this "fade-out" interval.
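
As a rough illustration only (not a physiological model), you can treat the visual response as a leaky integrator with a fade-out time constant of about 1/15 s and drive it with a blinking light. The response_ripple function below is made up for this sketch; at low blink rates the simulated response swings between nearly 0 and nearly 1, while above the fusion rate it settles near the duty-cycle average and never gets anywhere near 'off'.

Code:
# Rough sketch only, not a physiological model: treat the visual response as a
# leaky integrator (exponential fade with time constant tau ~ 1/15 s) driven by
# a square-wave light. Above the fusion rate the response hovers near the
# duty-cycle average instead of ever dropping back to 'off'.

def response_ripple(blink_hz, tau=1/15, duty=0.5, dt=1e-4, t_total=2.0):
    """Integrate dr/dt = (stimulus - r) / tau and return (mean, min, max)
    of the response over the second half of the run (after settling)."""
    r = 0.0
    period = 1.0 / blink_hz
    samples = []
    for i in range(int(t_total / dt)):
        t = i * dt
        stimulus = 1.0 if (t % period) < duty * period else 0.0
        r += (stimulus - r) * dt / tau          # simple Euler step
        if t > t_total / 2:
            samples.append(r)
    return sum(samples) / len(samples), min(samples), max(samples)

for hz in (1, 5, 15, 60):
    mean, lo, hi = response_ripple(hz)
    print(f"{hz:3d} Hz: mean {mean:.2f}, swings between {lo:.2f} and {hi:.2f}")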
 
Steve Drake said:
But why does it average out as 'on' to us and not 'off'?

It doesn't average to fully ON.

It averages to half brightness (or, more generally, to a brightness set by the ratio of ON to OFF time). For example, if it was ON for 25 ms and then OFF for 75 ms, it would appear roughly a quarter as bright as it would if ON all the time.

This is the way many light dimmers work.
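
A minimal sketch of that duty-cycle arithmetic (the function name below is made up for illustration; real dimmers switch fast enough that the eye only sees the average):

Code:
# Minimal sketch of the duty-cycle arithmetic behind this kind of dimming:
# above the flicker-fusion rate the apparent brightness is the time-averaged
# output of the on/off cycle.

def apparent_brightness(on_ms, off_ms, peak=1.0):
    """Time-averaged level of a light that is fully on for on_ms and fully
    off for off_ms in each cycle."""
    return peak * on_ms / (on_ms + off_ms)

print(apparent_brightness(25, 75))   # 0.25 -> roughly a quarter as bright
print(apparent_brightness(50, 50))   # 0.5  -> half brightness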
 
carrz said:
That certain point is around 1/15 of a second, i.e. about 15 flashes per second. When the photoreceptors and optic nerve are activated, it takes roughly that long, or a bit more, for the "impulse" to fade out, so the brightness persists as long as the dark intervals are shorter than this "fade-out" interval.
Many people, including me, can see 60 Hz flicker on a CRT monitor because the phosphors are designed for higher frequencies and fade too much at 60 Hz. Some people may not notice the flicker up close, but will notice it at some distance, say 3 meters or so, from the monitor. I still have a pair of CRT monitors and run both at the recommended 85 Hz refresh rate.

IBM 3270 type CRT monitors (mostly green text on a black background) have (had) a very low fade rate (a very high persistence factor) and could run at slower refresh rates. These terminals were mostly used with "protected" fixed text fields on the display, where the user keys data into the "unprotected" blank fields on the screen. The persistence was about a second, so a moving cursor left a fairly long "trail", and anything resembling a scroll-like operation took almost a second to clear up. However, the normal usage was a fixed-field mode, and with the high persistence a very thin beam could be used, so the text was very sharp.

Watching a film at a frame rate of around 18 frames per second or faster isn't as noticeable, because the period of darkness between frames is very short (the time to close the shutter, advance the frame, and reopen the shutter is short compared to the frame period).
 
rcgldr said:
Watching a film at a frame rate of around 18 frames per second or faster isn't as noticeable, because the period of darkness between frames is very short (the time to close the shutter, advance the frame, and reopen the shutter is short compared to the frame period).
On many (if not most) movie projectors, the light is interrupted twice per film frame. This doubles the flicker rate, which makes the flicker much less visible. It doesn't help with the jerkiness of the motion portrayal, though. Ah, the joy of watching good frame-rate up-conversion (over a hundred hertz) since they started giving us digital TV.
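
For concreteness (standard cinema figures, not something stated in the post above): a two- or three-blade shutter multiplies the flicker rate while the frame rate stays the same,

$$f_\text{flicker} = n_\text{blades}\times f_\text{frames}, \qquad 2 \times 24\ \text{fps} = 48\ \text{Hz}, \quad 3 \times 24\ \text{fps} = 72\ \text{Hz},$$

which puts the flicker well above the fusion threshold even though the motion is still sampled at only 24 frames per second.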
 
rcgldr said:
Many people, including me, can see 60 Hz flicker on a CRT monitor because the phosphors are designed for higher frequencies and fade too much at 60 Hz. Some people may not notice the flicker up close, but will notice it at some distance, say 3 meters or so, from the monitor. I still have a pair of CRT monitors and run both at the recommended 85 Hz refresh rate.

I guess the reason for that is either a quicker fade-out interval of the "eye/brain pixels", or a higher sensitivity that distinguishes more shades of brightness, or a little bit of both.
 
One good reason for the receptors having such a quick response time is that we actually need to perceive motion. A very 'laggy' response would blur movement (smearing), like the old CCTV pictures produced by cheap Vidicon camera tubes. That 'annoying' flicker we used to see from an old CRT TV in our peripheral vision means we are aware of small, rapid movements at the edge of the scene; that could have been a leopard ready to spring on us. We really didn't evolve for TV watching!
But there is, as always, an engineering compromise here. A longer response time would make us more sensitive in low light (and, incidentally, we'd enjoy TV more).
 
Steve Drake said:
But why does it average out as 'on' to us and not 'off'?

It's an interesting question. Visual thresholds depend on the source's brightness, contrast, size, color, and duration; a small, low-contrast object that is 'on' only briefly may not be seen at all. A nice overview of dynamic thresholds is here:

http://webvision.med.utah.edu/book/part-viii-gabac-receptors/temporal-resolution/

And here:
http://www.ski.org/CWTyler_lab/CWTyler/TylerPDFs/HamerTylerIV-JOSA1990.pdf
 
But why does it average out as 'on' to us and not 'off'?

If the average level is low enough, it will average out to 'off', even if the peak brightness is quite high. As it happens, such a low (marginal) level would be more visible in peripheral vision.
 
Is it simpler to understand if you just consider the number of photons hitting your eyeball? More photons = more light = brighter. Fewer = dimmer. Or I could be oversimplifying.
 
dkotschessaa said:
Is it simpler to understand if you just consider the number of photons hitting your eyeball? More photons = more light = brighter. Fewer = dimmer. Or I could be oversimplifying.

That's true, but it does not address the time interval over which blinking starts to look like continuous light. The optic nerve response has a "fade-out" interval: the more photons that hit, the longer the perceived brightness takes to fade. For typical TV-level brightness this fade-out interval is around 1/15 of a second. That's why 24 frames per second is generally sufficient to perceive "smooth animation".

But if you take a quick look at the Sun, the brightness perception can persist for more than a second, even with your eyes closed. In other words, if the Sun were blinking on and off at one-second intervals, we might not notice it with the naked eye, except that it would appear roughly half as bright. It's more complex than that, because besides the brightness impulse in the optic nerves there is also the brain interpreting it all, and the brain has its own processing rate and possibly other time-related limits influencing actual perception and the final "movie" experienced by the mind.
 