
Why does a PWM light dimmer actually work?

  1. Oct 30, 2008 #1

    cepheid

    Staff Emeritus
    Science Advisor
    Gold Member

    I've heard that one of the applications of PWM is to control the intensity of a light source (e.g. as in a dimmer). My question is, if you send a square wave voltage with a certain duty cycle to a lamp, it seems to me that it would have to be at a high enough frequency that humans wouldn't perceive the flicker. Ok, so if the frequency is that high, then answer me this:

    When the square wave is high, the lamp has the full voltage across it and lights up at maximum intensity. When it's low, the lamp is off. These are the two states of the lamp. If the periods of maximum intensity blur together, then why doesn't the lamp simply appear as though it is on at full intensity, continuously (regardless of duty cycle)? Why on earth would the bulb behave as though it were being exposed to a lower voltage that is the *average* of the waveform?
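    To make the question concrete, here's a minimal sketch of what "averaging" would mean for a purely resistive load under ideal PWM (all numbers below are made up for illustration, not from any real dimmer):

```python
# Minimal sketch (assumed values): time-averaged power delivered to a
# purely resistive load under ideal PWM switching.
V = 12.0      # supply voltage (V), assumed
R = 6.0       # load resistance (ohms), assumed
duty = 0.25   # duty cycle (fraction of each period the switch is on)

p_on = V**2 / R       # power while the switch is on
p_avg = duty * p_on   # average power over one full PWM period

print(f"peak power: {p_on:.1f} W, average power: {p_avg:.1f} W")
```

    So *if* the lamp (or the eye) responds to average rather than peak power, a 25% duty cycle delivers a quarter of the full power. The question is why that averaging happens at all.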
     
  3. Oct 31, 2008 #2

    cepheid

    Staff Emeritus
    Science Advisor
    Gold Member

    Is it because any circuit has some self-inductance, so the current through the lamp doesn't jump up to V/R instantaneously but (as we all know) rises like 1 minus an exponential when the supply is turned on? I mean, I don't know what the time constant of a typical light bulb circuit is, but is it "large" enough (relative to the typical PWM square-wave period) that the voltage source is not actually "on" long enough for the max current to develop? That would certainly make a whole lot of sense! Shorter duty cycle ==> smaller peak current ==> less power dissipated ==> cooler filament ==> dimmer and redder light. What do you guys think? Have I hit upon the explanation?
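    Here's a rough numerical check of this hypothesis (every value below is assumed for illustration; wiring inductance is typically only microhenries):

```python
import math

# Sketch (assumed values): current rise in a series RL circuit after the
# supply switches on: i(t) = (V/R) * (1 - exp(-t/tau)), with tau = L/R.
V = 12.0      # supply voltage (V)
R = 6.0       # lamp resistance (ohms)
L = 10e-6     # stray wiring inductance (H), assumed
tau = L / R   # RL time constant

f_pwm = 1000.0        # assumed PWM frequency (Hz)
t_on = 0.5 / f_pwm    # on-time at 50% duty cycle

i_frac = 1 - math.exp(-t_on / tau)   # fraction of V/R reached at switch-off
print(f"tau = {tau*1e6:.2f} us, on-time = {t_on*1e6:.0f} us")
print(f"current reaches {i_frac*100:.4f}% of V/R before switch-off")
```

    With numbers like these, tau is a couple of microseconds while the on-time is hundreds of microseconds, so the current reaches essentially the full V/R every cycle. That suggests stray inductance alone can't be the explanation.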
     
  4. Oct 31, 2008 #3
    That's not so much the reason, though it does have an effect. Most of the reason is just that that's the way your brain works. I think the average human brain can't identify oscillations past about 60 Hz (the refresh rate of CRT monitors). In a nutshell, when a high enough frequency is reached, our brains can't identify the light as turning on and off rapidly, so they just sort of average the intensity of the light.
     
  5. Nov 2, 2008 #4

    cepheid

    Staff Emeritus
    Science Advisor
    Gold Member

    I agree with this...

    ...but I'm just taking this on faith right now. The explanation that your brain "just kinda does the averaging" is tough to justify. After all, how do we know this? However, somebody did post an explanation of sorts in another thread of mine (PWM for Motor Control in the Electrical Engineering subforum) that I'm still chewing on.
     
  6. Nov 2, 2008 #5

    rcgldr

    Homework Helper

    Assuming it's an incandescent bulb, the filament dissipates heat slowly compared to the PWM switching rate, so the variation in light output over a cycle is small.
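    A quick sketch of that idea (the time constant and frequency below are assumptions, just to show the scale): treating the filament as a first-order thermal low-pass filter, the temperature ripple from the PWM heating is attenuated by roughly 1/sqrt(1 + (2*pi*f*tau)^2).

```python
import math

# Sketch (assumed values): filament modeled as a first-order thermal
# low-pass filter driven by the PWM heating waveform.
tau_th = 0.05    # assumed filament thermal time constant (s)
f_pwm = 1000.0   # assumed PWM frequency (Hz)

ripple = 1.0 / math.sqrt(1.0 + (2.0 * math.pi * f_pwm * tau_th) ** 2)
print(f"relative temperature ripple: {ripple:.5f}")
# ~0.003: at 1 kHz the filament temperature barely wiggles, so the light
# output tracks the *average* heating power, not the instantaneous power.
```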
     
  7. Nov 3, 2008 #6
    The light dimmer usually uses a triac, which chops each half-cycle of the 60 Hz mains, so the light-pulse frequency is 120 Hz. Your TV flickers much slower than that.
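    Strictly speaking, a triac dimmer is phase control rather than PWM, but the averaging idea is the same. A numerical sketch (line voltage and firing angle below are assumed for illustration):

```python
import numpy as np

# Sketch (assumed values): RMS voltage from a triac phase-cut dimmer.
# The triac blocks each half-cycle until the firing angle alpha, then
# conducts until the zero crossing.
V_m = 170.0            # peak of a nominal 120 V RMS line
alpha = np.pi / 2      # firing angle: 90 degrees, assumed

theta = np.linspace(0.0, np.pi, 100_000)              # one half-cycle
v = np.where(theta >= alpha, V_m * np.sin(theta), 0.0)
v_rms = np.sqrt(np.mean(v ** 2))

print(f"RMS at alpha = 90 deg: {v_rms:.1f} V "
      f"(undimmed: {V_m/np.sqrt(2):.1f} V)")
# Firing at 90 degrees gives V_m/2 = 85 V RMS, i.e. half the full power.
```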
     
  8. Nov 3, 2008 #7
    Imagine the lamp being off (black, no intensity) for about 99% of each cycle, and emitting a very short pulse of light (full intensity) for the remaining 1%, at a frequency high enough that we (humans) can't see the flickering. Do you still think we would see the lamp as though it were at full intensity all the time?
     
  9. Nov 3, 2008 #8

    rcgldr

    Homework Helper

    No, especially if there was any movement being observed. When I used to play table tennis, I could see a noticeable strobe effect on the ball under fluorescent lighting (the ball would transition between dull and bright: a dull yellow and a bright white, back in the days of white balls). Motion looked strobed, like video shot with a very fast shutter speed played back at a normal frame rate. I see CRT monitor flicker at anything less than 80 Hz, although I'm not sure what the persistence of a typical CRT phosphor is.
     
  10. Nov 3, 2008 #9
    That's a question I can't answer, except to say that the effect can be observed rather frequently. Think of watching a movie at the theater, or any video for that matter: you don't see the individual frames; your brain blends them together into a uniform, continuous picture. I don't know why the brain does this, I just know it does.

    I believe he's referring to something powered by DC: not an incandescent bulb, but something more like an LED.
     
  11. Nov 3, 2008 #10
    An incandescent lamp driven by 60 Hz was the issue. The problem set-up was a square-wave voltage. The OP asks about peak-intensity perception vs. average-intensity perception, apparently. According to the mythology I've been exposed to, the eye perceives peak intensity, not average. Which is it?
     
  12. Nov 3, 2008 #11
    It's sort of both.

    http://www.digchip.com/datasheets/parts/datasheet/000/APP03.php
     
  13. Nov 3, 2008 #12
  14. Nov 5, 2008 #13

    cepheid

    Staff Emeritus
    Science Advisor
    Gold Member

    That's a good point that I failed to take into account. Even if the current in the filament gets cut off, it may not cool down right away by enough to stop glowing visibly.

    You don't need to explain to me how cartoons/movies etc work. I'm aware of this effect. I already acknowledged it in post #4. But not being able to perceive very quick changes and blurring them together into something continuous is not the same thing as averaging them, and we were trying to establish that the latter was also true.

    I didn't really specify it. I left it ambiguous and I apologize for that. When I first posted the problem, I had in my mind an incandescent bulb, because that's what we usually visualize when we think of a light dimmer. However, in my second post, I guess I unwittingly contradicted that somewhat because my descriptions were much more compatible with the DC scenario.

    No. Intuitively, I don't. Clearly, then, *some* sort of averaging effect must be taking place. The biological explanation for it is probably beyond us (and perhaps not even well understood). But the approach that some people have taken (and that Topher brought our attention to in his last post), of likening the eye + brain to a photometer and attempting to determine what kind of photometer it is most analogous to, seems like a good one. Indeed, as Phrak put it, is it peak perception or average perception?

    Again, good question

    Interesting answer, and thanks for the link!

    Hmmm... seeing how his post now has a quoted paragraph, I'm going to assume that it didn't originally, but that he edited his post and quoted a paragraph as you requested.

    Thank you all for your input. To summarize, I guess we have concluded that the dimming effect is largely a result of the way the eye + brain perceive light, but other effects that would tend to slow the response of the system to rapid changes in the input (e.g. heat dissipation or inductance) may also play a small role.

    If the eye behaved as an integrating photometer, then I guess I can see why the brightness perceived (as averaged over some time period) would be lower for a PWM'd lamp with < 100% duty cycle than it would be for a lamp that's just on all the time. Topher's added wrinkle that the effect is somewhere between integrating and peak perception is intriguing, to say the least.
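    As a toy model of the integrating-photometer idea (the time constant below is an assumption, not measured physiology), you can low-pass filter a PWM intensity waveform and watch the output settle at the duty cycle:

```python
import numpy as np

# Sketch (assumed model, not established physiology): the eye + brain as
# a leaky integrator (first-order low-pass) acting on lamp intensity.
tau = 0.02      # assumed visual integration time constant (s)
f_pwm = 1000.0  # PWM frequency (Hz)
duty = 0.25     # duty cycle
dt = 1e-5
t = np.arange(0.0, 0.2, dt)

x = ((t * f_pwm) % 1.0 < duty).astype(float)    # 0/1 PWM intensity waveform

y = np.zeros_like(x)                            # filter state
for i in range(1, len(t)):
    y[i] = y[i-1] + dt * (x[i] - y[i-1]) / tau  # y' = (x - y) / tau

print(f"settled 'perceived' level: {y[-1]:.3f} (duty cycle = {duty})")
# The output converges to ~0.25: the waveform's time average, not its peak.
```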
     
    Last edited: Nov 5, 2008
  15. Nov 6, 2008 #14
    I think you're correct, and thanks for pointing that out! And thank you also, Topher. Too bad the graph doesn't show up.

    The effect of inductance is way down in the mud; it's too small to have any perceptible effect.

    But the dimming from heat dissipation is another matter. Heat leaves the filament, and as this happens its temperature (and hence its peak blackbody frequency) decays roughly exponentially. The thermal mass of the thin tungsten filament is quite small, so the time constant is also small. Having observed a filament with this question in mind, I would put it in the same order of magnitude as 1/f, where f is the 60 Hz line frequency.
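    To put rough numbers on that (the operating temperature and time constant below are assumptions, not measurements):

```python
import math

# Sketch (assumed values): exponential cooling of a filament after
# switch-off, with the blackbody peak tracked via Wien's displacement
# law, lambda_peak = b / T.
b = 2.898e-3     # Wien's displacement constant (m*K)
T_hot = 2800.0   # assumed operating filament temperature (K)
T_amb = 300.0    # ambient temperature (K)
tau = 1.0 / 60.0 # assumed thermal time constant, ~ one line period (s)

for t in (0.0, tau / 4, tau / 2, tau):
    T = T_amb + (T_hot - T_amb) * math.exp(-t / tau)
    print(f"t = {t*1e3:5.2f} ms: T = {T:6.0f} K, "
          f"blackbody peak = {b / T * 1e9:4.0f} nm")
# Within the ~8 ms between 120 Hz light pulses (tau/2 here) the filament
# cools only partway toward ambient, so it keeps glowing through the
# off-time, just cooler and redder.
```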
     
  16. Nov 6, 2008 #15
    It's actually been there all along; I never edited the post. I guess PF was just flaking out for a moment or something.
     