#1 quozzy
This problem just occurred to me while I was taking photos indoors:
Light sources powered by mains electricity fluctuate in intensity because the current alternates. In other words, a lightbulb actually flickers at 50 Hz in Europe and 60 Hz in the US (or, really, at 100 Hz/120 Hz, since brightness depends on the magnitude of the current, which peaks twice per cycle). This is not noticeable to humans because it is faster than the flicker-fusion threshold of our vision, so we perceive a steady stream of light.
So far, so good.
But what if we take a camera and set its shutter speed to something significantly shorter, say 1/1000 s? We should then be able to observe the fluctuation: in a succession of photos taken indoors, some should come out brighter than others. I just tried this a number of times by photographing a lightbulb, and there is no visible difference between any of the photos, either in the image itself or in the histogram. Why is that?
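Under the idealised model the question assumes, where brightness tracks the instantaneous electrical power with no smoothing, the expected frame-to-frame variation can be sketched numerically. This is only a sketch of that assumption, not of what a real bulb does; the 50 Hz mains frequency and 1/1000 s shutter speed are taken from the question, everything else is illustrative:

```python
import numpy as np

MAINS_HZ = 50         # European mains frequency (the question also mentions 60 Hz)
SHUTTER_S = 1 / 1000  # the 1/1000 s shutter speed from the question

def intensity(t):
    """Idealised brightness: proportional to instantaneous power,
    i.e. sin^2 of the mains waveform, so it flickers at 2 * MAINS_HZ."""
    return np.sin(2 * np.pi * MAINS_HZ * t) ** 2

# Average brightness captured during the exposure, for many random
# shutter-opening moments spread across one 10 ms flicker period.
rng = np.random.default_rng(0)
starts = rng.uniform(0, 1 / (2 * MAINS_HZ), size=10_000)
window = np.linspace(0, SHUTTER_S, 200)
exposures = np.array([intensity(s + window).mean() for s in starts])

print(f"brightest/darkest frame ratio: {exposures.max() / exposures.min():.0f}")
```

For a source that really did track the mains power instant by instant, 1/1000 s frames would differ dramatically depending on where in the flicker cycle the shutter happens to open, which is exactly the variation the experiment failed to show.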