Can you help me figure out how this light-based phenomenon works? Please explain with an intuitive answer using the simplest principles possible.
Note how the location and width of the dark horizontal bands are constant with respect to the edge of the video frame.
The LED light looks normal enough to the human eye, but the effect is visible through a phone camera, and it was reproduced on another person's phone camera as well. I can think of three things that might contribute to it:
1. LEDs are usually dimmed by cycling them on and off quickly, so that the human eye sees something like the average emitted intensity. The cycling frequency seemed very high, or at least smooth; I didn't see any strobe effects with my eyes when I shook my head from side to side. If the cycle frequency were low, I would have seen bright afterimage trails "burned" into my vision.
2. I imagine that your average phone camera acquires frames at a certain frequency. This, and perhaps the physics/protocol of how the sensor reads out light, might have an effect (a toy simulation of how 1 and 2 could interact is sketched after this list).
3. The software post-processing of the video could also contribute. For that to be the case, though, the post-processing would have to be common across different phones.
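
To make hypotheses 1 and 2 concrete, here is a rough toy simulation of a PWM-dimmed LED seen through a rolling-shutter readout, where each sensor row starts its exposure slightly later than the row above it. Every number in it (PWM frequency, duty cycle, frame readout time, per-row exposure) is a made-up illustrative value, not a measurement of the actual light or phone:

```python
# Toy model: a PWM-dimmed LED sampled by a rolling-shutter camera.
# All constants below are assumed illustrative values, not measurements.

import numpy as np

PWM_HZ = 2000.0              # assumed LED PWM frequency (fast enough to look steady to the eye)
DUTY = 0.5                   # assumed duty cycle (fraction of each PWM period the LED is on)
ROWS = 1080                  # number of sensor rows in one frame
FRAME_READOUT_S = 1 / 30.0   # assumed time to read out all rows of one frame
ROW_EXPOSURE_S = 1 / 8000.0  # assumed per-row exposure time (short, as in a bright scene)

def led_on(t):
    """True where the PWM-driven LED is on at time(s) t."""
    return ((t * PWM_HZ) % 1.0) < DUTY

def row_brightness(row):
    """Fraction of one row's exposure window during which the LED was on.

    With a rolling shutter, each row starts exposing slightly later than
    the row above, so different rows sample different phases of the PWM cycle.
    """
    t_start = row * FRAME_READOUT_S / ROWS
    samples = np.linspace(t_start, t_start + ROW_EXPOSURE_S, 100)
    return np.mean(led_on(samples))

frame = np.array([row_brightness(r) for r in range(ROWS)])

# Rows whose short exposure window landed mostly in the LED's "off" phase
# come out dark, giving a repeating pattern of horizontal bands.
print("darkest rows:", np.argsort(frame)[:5])
print("brightest rows:", np.argsort(frame)[-5:])
```

The intuition in this model: a row comes out dark when its exposure window happens to fall in the LED's off phase, and since that window's timing is tied to the row index, the bands sit at fixed heights in the frame as long as the PWM and readout clocks stay in step.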
Thanks for any input!