
Why does a light blinking quickly average out as 'on' to us?

  1. Jul 23, 2014 #1
    Consider a light, an LED for example, turning on and off once per second. For humans, we will look at it and think "clearly on, clearly off, clearly on, clearly off" for each 'state'.

    Our view of whether it is on or off will continue like this as its 'blink frequency' is increased, up to a certain point. Once it is blinking fast enough, we won't be able to tell that it is turning off at all, and it will seem 'on' to us 100% of the time.

    But why does it average out as 'on' to us and not 'off'?
     
  3. Jul 23, 2014 #2

    Drakkith

    User Avatar

    Staff: Mentor

    Let's say that the light is on half the time and off the other half. If we increase the frequency up past the point where our eyes can no longer tell the light is blinking, then the light will look half as bright as when it is always on. Since a light that is half as bright as normal is still "on" to us, the light looks on.

    Pretty much it boils down to "off" being the state when the light isn't emitting any light. So as long as at least some light is being emitted, the light will look on.
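
    To put rough numbers on that (the 50% duty cycle and the brightness scale here are just illustrative assumptions), something like this sketch:
    Code (Python):
    def perceived_brightness(peak, fraction_on):
        """Above the flicker-fusion rate the eye roughly time-averages the light,
        so perceived brightness ~ peak brightness * fraction of time the light is on."""
        return peak * fraction_on

    # Blinking half the time looks about half as bright, but it still looks "on".
    print(perceived_brightness(peak=1.0, fraction_on=0.5))   # 0.5
    # Only a light that emits nothing at all averages out to "off".
    print(perceived_brightness(peak=1.0, fraction_on=0.0))   # 0.0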
     
  4. Jul 23, 2014 #3
    That certain point is around 1/15 of a second, or about 15 frames per second. When the optic nerves are activated it takes about that long, or a bit more, for the 'impulse' to fade out, so the brightness persists as long as the dark intervals between flashes are shorter than this 'fade out' interval.
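
    As a rough rule of thumb (the 1/15 s figure above is an approximation, not a precise physiological constant), the blinking fuses when the dark gaps are shorter than that fade-out time:
    Code (Python):
    FADE_OUT_S = 1 / 15  # approximate persistence time quoted above

    def looks_continuous(dark_gap_s):
        """The flashes fuse into steady light roughly when each dark gap is
        shorter than the time the previous impulse takes to fade out."""
        return dark_gap_s < FADE_OUT_S

    print(looks_continuous(0.5))    # False: 1 Hz blinking is clearly visible
    print(looks_continuous(0.02))   # True: ~25 Hz blinking looks steady to most people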
     
  5. Jul 23, 2014 #4

    CWatters

    User Avatar
    Science Advisor
    Homework Helper

    It doesn't average to fully ON.

    It averages to half brightness (or a brightness that depends on the ratio of ON to OFF time). For example, if it was ON for 25 ms then OFF for 75 ms, it would appear roughly a quarter as bright as it would if ON all the time.

    This is the way many light dimmers work.
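
    A sketch of the duty-cycle arithmetic behind that kind of dimmer, using the 25 ms / 75 ms figures above (the code is only an illustration, not how any particular dimmer is implemented):
    Code (Python):
    def dimmed_brightness(on_ms, off_ms, peak=1.0):
        """Perceived brightness from a fast ON/OFF cycle is roughly the peak
        brightness scaled by the fraction of each period the light is ON."""
        return peak * on_ms / (on_ms + off_ms)

    print(dimmed_brightness(on_ms=25, off_ms=75))   # 0.25 -> about a quarter as bright
    print(dimmed_brightness(on_ms=50, off_ms=50))   # 0.5  -> about half as bright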
     
  6. Jul 23, 2014 #5

    rcgldr

    User Avatar
    Homework Helper

    Many people, including me, can see 60 Hz flicker on a CRT monitor because the phosphors are designed for higher frequencies and fade too much at 60 Hz. Some people may not notice the flicker up close, but will notice it at some distance, like 3 meters or so, from the monitor. I still have a pair of CRT monitors and run both at the recommended 85 Hz refresh rate.

    IBM 3270 type CRT monitors (mostly green text on a black background) have (had) a very low fade rate (a very high persistence factor) and could run at slower refresh rates. These terminals were mostly used with "protected" fixed text fields on the display, where the user keys data into the "unprotected" blank fields on the screen. The persistence was about a second, so a moving cursor left a fairly long "trail", and anything resembling a scroll operation took almost a second to clear up. However, normal usage was fixed-field mode, and with the high persistence a very thin beam could be used, so the text was very sharp.

    The flicker isn't as noticeable when watching a film at around 18 frames per second or faster, because the period of darkness between frames is very short (the time to close the shutter, advance the frame, and open the shutter is very short compared to the frame period).
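
    A toy model of why the lower refresh rate flickers more (exponential decay and the 5 ms decay constant are assumptions for illustration, not real phosphor specs):
    Code (Python):
    import math

    def dimmest_point(refresh_hz, decay_time_s=0.005):
        """Treat the phosphor as decaying exponentially after each refresh; the
        visible flicker depth depends on how far it fades before the next refresh."""
        frame_period_s = 1.0 / refresh_hz
        return math.exp(-frame_period_s / decay_time_s)

    # The dip between refreshes is much deeper at 60 Hz than at 85 Hz,
    # which is roughly why some people see 60 Hz flicker but not 85 Hz.
    for hz in (60, 85):
        print(hz, round(dimmest_point(hz), 3))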
     
    Last edited: Jul 23, 2014
  7. Jul 25, 2014 #6

    sophiecentaur

    User Avatar
    Science Advisor
    Gold Member

    On many, if not most, movie projectors the light is interrupted twice for every film frame that goes through. This doubles the flicker rate, which makes the flicker much less visible. It doesn't help with the jerkiness of the motion portrayal, though. Ah, the joy of watching good frame-rate up-conversion (over a hundred Hz) since they started giving us digital TV.
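
    The arithmetic behind that trick (24 fps is the usual cinema rate, assumed here):
    Code (Python):
    def flicker_rate_hz(frames_per_second, interruptions_per_frame):
        """A multi-bladed shutter interrupts the light several times per film frame,
        so the flicker the eye sees is faster than the frame rate itself."""
        return frames_per_second * interruptions_per_frame

    print(flicker_rate_hz(24, 1))   # 24 Hz: noticeable flicker
    print(flicker_rate_hz(24, 2))   # 48 Hz: two interruptions per frame, much less visible
    print(flicker_rate_hz(24, 3))   # 72 Hz: three-bladed shutters push it even higher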
     
  8. Jul 25, 2014 #7
    I guess the reason for that is either a quicker fade-out interval of the 'eye/brain pixels', or a higher sensitivity that distinguishes more shades of brightness, or a little bit of both.
     
  9. Jul 26, 2014 #8

    sophiecentaur

    User Avatar
    Science Advisor
    Gold Member

    One good reason for the receptors having such a quick response time is that we actually need to perceive motion. A very 'laggy' response would lead to a blurring of movement (smearing) like the old CCTV pictures, which were produced from the cheap old Vidicon camera tubes. That 'annoying' flicker that we used to see with an old CRT TV in our peripheral vision means that we are aware of small rapid movements on the edge of the scene. That could have been a leopard, ready to spring on us. We really weren't evolved for TV watching!
    But there is, as always, an engineering compromise here. Having a longer response time would mean that we could be more sensitive in low light (and, incidentally we'd enjoy TV more).
     
  10. Jul 30, 2014 #9

    Andy Resnick

    User Avatar
    Science Advisor
    Education Advisor
    2016 Award

    It's an interesting question: visual thresholds depend on the source brightness, contrast, size, color, and also duration. A small, low-contrast object that is 'on' for a short period of time may not be seen. A nice overview of dynamic thresholds is here:

    http://webvision.med.utah.edu/book/part-viii-gabac-receptors/temporal-resolution/

    And here:
    http://www.ski.org/CWTyler_lab/CWTyler/TylerPDFs/HamerTylerIV-JOSA1990.pdf [Broken]
     
    Last edited by a moderator: May 6, 2017
  11. Jul 30, 2014 #10

    sophiecentaur

    User Avatar
    Science Advisor
    Gold Member

    If the average level is low enough (below the visual threshold), then it will average out to 'off', even if the peak brightness is quite high. As it happens, such a marginal level would be more visible on the periphery of vision.
     
  12. Jul 30, 2014 #11
    Is it simpler to understand if you just consider the number of photons hitting your eyeball? More photons = more light = brighter. Fewer = dimmer. Or I could be oversimplifying.
     
  13. Jul 30, 2014 #12
    That's true, but it doesn't address the time scale at which, instead of blinking, we start to see continuous light. The optic nerves have a 'fade out' interval: the more photons hit, the longer it takes for the perceived brightness to fade out. For typical TV brightness this fade-out interval is around 1/15 of a second, which is why 24 frames per second is generally sufficient to perceive smooth motion.

    But if you take a quick look at the Sun, the brightness perception would likely persist for more than a second, even with your eyes closed. In other words, if the Sun were blinking on and off at one-second intervals, we might not be able to notice it with the naked eye, except that it would appear roughly half as bright. It's more complex than that, because besides the brightness impulse in the optic nerves there is also the brain interpreting it all, and the brain has its own operating frequency and possibly other time-related limits that influence actual perception and the final 'movie' experienced by the mind.
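
    One way to see both effects together is to model the eye as a simple leaky integrator (a crude stand-in for the "fade out" behaviour described above; the 1/15 s time constant and 50% duty cycle are just assumptions). As the blink rate goes up, the ripple in the response shrinks until the light looks steadily 'on' at roughly the duty-cycle-averaged brightness:
    Code (Python):
    def response_swing(blink_hz, duty_cycle=0.5, fade_time_s=1/15,
                       sim_time_s=1.0, dt=1e-4):
        """Drive a leaky integrator (a crude model of retinal fade-out) with a
        square-wave light source and report how much its output still swings."""
        period = 1.0 / blink_hz
        response, samples = 0.0, []
        for i in range(int(sim_time_s / dt)):
            t = i * dt
            light = 1.0 if (t % period) < duty_cycle * period else 0.0
            response += (light - response) * (dt / fade_time_s)  # exponential approach
            if t > sim_time_s / 2:                               # skip the initial transient
                samples.append(response)
        return min(samples), max(samples)

    for hz in (1, 5, 25, 100):
        lo, hi = response_swing(hz)
        print(f"{hz:>3} Hz: response swings between {lo:.2f} and {hi:.2f}")
    At 1 Hz the modelled response swings over nearly the full range (clearly blinking), while above a few tens of hertz it sits close to the 50% level, i.e. the duty-cycle-averaged brightness described in the earlier posts.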
     