Jeff Reid said:
Assuming it's an incandescent bulb, the rate of heat dissipation is relatively slow compared to the rate of the PWM, and the variation in light output over time is small.
That's a good point that I failed to take into account. Even if the current in the filament is cut off, the filament may not cool down quickly enough to stop glowing visibly before the next pulse arrives.
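Just to convince myself of the scale of this effect, here's a crude Python sketch. It's my own toy model, not anything from Jeff's post: I'm treating the filament as a simple first-order thermal system and ignoring the strongly nonlinear relation between temperature and emitted light, and the 30 ms time constant and 1 kHz PWM frequency are assumed numbers for illustration only. The point is just that when the thermal time constant dwarfs the PWM period, the ripple in the filament's state is tiny compared to its average level:

```python
# Toy sketch: filament as a first-order thermal system driven by a PWM square wave.
# All numbers are illustrative assumptions, not measured filament properties.
tau = 0.030        # assumed thermal time constant: 30 ms
pwm_freq = 1000.0  # assumed PWM frequency: 1 kHz
duty = 0.5         # 50% duty cycle
dt = 1e-5          # simulation time step: 10 microseconds

period = 1.0 / pwm_freq
T = 0.0            # normalized "temperature" (0 = cold, 1 = steady state at full power)
samples = []

t = 0.0
while t < 0.5:                        # half a second is plenty for the transient to settle
    drive = 1.0 if (t % period) < duty * period else 0.0
    T += (drive - T) * dt / tau       # forward-Euler step of dT/dt = (drive - T) / tau
    if t > 0.4:                       # record only the settled behaviour
        samples.append(T)
    t += dt

print(f"mean level:         {sum(samples) / len(samples):.3f}")
print(f"ripple (max - min): {max(samples) - min(samples):.4f}")
```

With these numbers the mean sits near 0.5 (the duty cycle) while the ripple is well under 1%, which is exactly the "slow dissipation compared to the PWM rate" picture.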
Topher925 said:
That's a question that I cannot answer, except that it can be observed rather frequently. Think of it as watching a movie at the theater, or any video for that matter. You don't see the individual frames of the video; your brain blends them together to get a uniform and continuous picture. I don't know why the brain does this, I just know it does.
You don't need to explain to me how cartoons/movies etc. work. I'm aware of this effect; I already acknowledged it in post #4. But not being able to perceive very quick changes, and blurring them together into something continuous, is not the same thing as averaging them, and it's the latter we were trying to establish.
Topher925 said:
I believe he's referring to something that is powered by DC. Not an incandescent bulb, but something more like an LED.
I didn't really specify it. I left it ambiguous and I apologize for that. When I first posted the problem, I had in my mind an incandescent bulb, because that's what we usually visualize when we think of a light dimmer. However, in my second post, I guess I unwittingly contradicted that somewhat because my descriptions were much more compatible with the DC scenario.
Nick89 said:
Imagine the lamp being off (black, no intensity) for about 99% of one cycle, and giving a very short light pulse (full intensity) for the other 1% of the cycle. The lamp does this at a frequency high enough that we (humans) can't see the flickering. Do you still think that we would see the lamp as though it was at full intensity all the time?
No. Intuitively, I don't. Clearly, then, *some* sort of averaging effect must be taking place. The biological explanation for it is probably beyond us (and perhaps not even well understood). But the approach that some people have taken (and that Topher brought to our attention in his last post) of likening the eye + brain to a photometer, and trying to determine what kind of photometer it is most analogous to, seems like a good one. Indeed, as Phrak put it, is it peak perception or average perception?
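For concreteness, here's a toy sketch of the two candidate "photometer" models applied to Nick89's thought experiment. Both models and the function name are my own illustrative assumptions; neither is a claim about how vision actually works:

```python
# Toy comparison of "peak-reading" vs "integrating" perception of a PWM'd lamp.
# Illustrative models only, not claims about human vision.
def perceived_brightness(peak_intensity, duty_cycle, model="integrating"):
    if model == "peak":
        return peak_intensity                # pulses look as bright as a steady lamp
    if model == "integrating":
        return peak_intensity * duty_cycle   # time-average over one cycle
    raise ValueError(f"unknown model: {model}")

# Nick89's example: full-intensity pulses for 1% of each cycle.
print(perceived_brightness(1.0, 0.01, model="peak"))         # 1.0  -> looks fully on
print(perceived_brightness(1.0, 0.01, model="integrating"))  # 0.01 -> looks very dim
```

Since intuition says the 1%-duty lamp would look dim, the integrating column is clearly closer to the truth than the peak-reading one.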
Phrak said:
An incandescent lamp driven at 60 Hz was the issue. The problem setup was a square-wave voltage. The OP asks about peak intensity perception vs. average intensity perception, apparently. According to the mythology I've been exposed to, the eye perceives peak intensity, not average. Which is it?
Again, good question.
Topher925 said:
Interesting answer, and thanks for the link!
Phrak said:
Thanks, Topher. Is there a paragraph you can quote?
Hmmm... seeing as his post now has a quoted paragraph, I'm going to assume that it didn't originally, and that he then edited his post to quote a paragraph as you requested.
Thank you all for your input. To summarize, I guess we have concluded that the dimming effect is largely a result of the way the eye + brain perceive light, but other effects that tend to slow the response of the system to rapid changes in the input (e.g. heat dissipation in the filament, or inductance) may also play a small role.
If the eye behaved as an integrating photometer, then I guess I can see why the brightness perceived (as averaged over some time period) would be lower for a PWM'd lamp with < 100% duty cycle than it would be for a lamp that's just on all the time. Topher's added wrinkle that the effect is somewhere between integrating and peak perception is intriguing, to say the least.
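Just to picture what "somewhere between integrating and peak" might look like numerically, here's a purely hypothetical blend. The weighting w below is my own invention for illustration, not anything from Topher's link:

```python
# Purely illustrative: a weighted mix of the integrating and peak models.
# w = 0 gives a pure integrating (average) response, w = 1 a pure peak response.
def blended_brightness(peak, duty, w=0.2):
    return w * peak + (1.0 - w) * peak * duty

print(f"{'duty':>5} {'integrating':>12} {'peak':>5} {'blend (w=0.2)':>14}")
for duty in (0.1, 0.25, 0.5, 0.75, 1.0):
    print(f"{duty:5.2f} {duty:12.2f} {1.0:5.2f} {blended_brightness(1.0, duty):14.2f}")
```

Under a blend like this, the lamp still dims monotonically as the duty cycle drops, but never quite as much as the pure average would suggest, which seems consistent with the "in between" idea.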