Why does a PWM light dimmer actually work?

In summary, PWM can control the intensity of a light source by driving it with a square-wave voltage of a given duty cycle. The switching frequency must be high enough that humans do not perceive flicker. The lamp is at full intensity while the square wave is high and off while it is low; at high frequencies the eye and brain average the light pulses into a constant perceived intensity. Strictly speaking, the eye partially integrates and partially takes peak readings, so a pulsed source appears somewhat brighter than pure averaging would predict.
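As a rough illustration of the averaging claim (a minimal sketch; the function and parameter names are invented for this example), a purely integrating detector would perceive brightness in direct proportion to the duty cycle:

```python
# Toy model: a purely integrating detector perceives the time-averaged
# intensity of a PWM'd source. The names here are illustrative only.

def average_intensity(peak: float, duty: float) -> float:
    """Time-averaged intensity of a square-wave pulsed source.

    duty is the fraction of each period the lamp is on (0.0 to 1.0).
    """
    return peak * duty

# A lamp pulsed at 25% duty cycle looks, to an ideal integrator,
# a quarter as bright as the same lamp driven continuously.
print(average_intensity(peak=1.0, duty=0.25))  # 0.25
```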
  • #1
cepheid
I've heard that one of the applications of PWM is to control the intensity of a light source (e.g. as in a dimmer). My question is, if you send a square wave voltage with a certain duty cycle to a lamp, it seems to me that it would have to be at a high enough frequency that humans wouldn't perceive the flicker. Ok, so if the frequency is that high, then answer me this:

When the square wave is high, the lamp has the full voltage across it and lights up at maximum intensity. When it's low, the lamp is off. These are the two states of the lamp. If the periods of maximum intensity blur together, then why doesn't the lamp simply appear as though it is on at full intensity, continuously (regardless of duty cycle)? Why on Earth would the bulb behave as though it were being exposed to a lower voltage that is the *average* of the waveform?
 
  • #2
Is it because any circuit has some self-inductance, so the current through the lamp doesn't jump up to V/R instantaneously but (as we all know) rises like one minus an exponential when the supply is turned on? I mean, I don't know what the time constant of a typical light bulb circuit is, but is it "large" enough (relative to the typical PWM square-wave period) that the voltage source is not actually "on" long enough for the maximum current to develop? That would certainly make a whole lot of sense! Shorter duty cycle ==> smaller peak current ==> less power dissipated ==> cooler filament ==> dimmer and redder light. What do you guys think? Have I hit upon the explanation?
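A quick sanity check of that idea (a minimal sketch; the V, R, and L values are assumptions, not measured lamp data) suggests the RL time constant is far too short to matter:

```python
import math

# Toy RL model of the lamp circuit. V, R, and L are assumed values
# chosen for illustration, not properties of a real bulb.
V = 120.0   # supply voltage (volts)
R = 144.0   # filament resistance (ohms)
L = 1e-6    # stray circuit inductance (henries), deliberately generous

tau = L / R            # RL time constant: ~7 nanoseconds here
t_on = 0.5 / 1000.0    # on-time of a 1 kHz, 50%-duty pulse

# Current reached by the end of the on-time: i(t) = (V/R)(1 - e^(-t/tau))
i_end = (V / R) * (1 - math.exp(-t_on / tau))
print(i_end, V / R)  # both ~0.833 A: the current fully develops,
                     # so stray inductance can't explain the dimming
```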
 
  • #3
Is it because any circuit has some self inductance

That's not so much the reason, but it does have an effect. Most of the reason is just that that's the way your brain works. I think the average human brain can't identify oscillations past about 60 Hz (the refresh frequency of CRT monitors). In a nutshell, when a high enough frequency is reached our brains can't identify the light as turning on and off rapidly, so it just sort of averages the intensity of light.
 
  • #4
Topher925 said:
In a nutshell, when a high enough frequency is reached our brains can't identify the light as turning on and off rapidly

I agree with this...

Topher925 said:
so it just sort of averages the intensity of light.

...but I'm just taking this on faith right now. The explanation that your brain "just kinda does the averaging" is tough to justify. After all, how do we know this? However, somebody did post an explanation of sorts in another thread of mine (PWM for Motor Control in the Electrical Engineering subforum) that I'm still chewing on.
 
  • #5
Assuming it's an incandescent bulb, the rate of heat dissipation is relatively slow compared to the rate of the PWM, and the variation in light output over time is small.
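A minimal sketch of that thermal-lag argument (the temperatures and time constant below are assumed round numbers, not measurements):

```python
import math

# Toy cooling model: the filament relaxes exponentially toward ambient
# while the drive is off. The temperatures and time constant are
# assumed numbers for illustration, not measurements.
T_hot = 2700.0   # operating filament temperature (kelvin)
T_amb = 300.0    # ambient temperature (kelvin)
tau = 0.02       # assumed filament thermal time constant (seconds)

t_off = 0.5 / 1000.0  # off-time of a 1 kHz, 50%-duty PWM signal

T_end = T_amb + (T_hot - T_amb) * math.exp(-t_off / tau)
print(round(T_end))  # ~2641 K: the filament barely cools between
                     # pulses, so the light output ripples only slightly
```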
 
  • #6
The light dimmer usually uses a triac, so the light-pulse frequency is 120 Hz (twice the 60 Hz line frequency). Your TV flickers much slower than that.
 
  • #7
Imagine the lamp being off (black, no intensity) for about 99% of each cycle, and emitting a very short pulse of light (full intensity) for the remaining 1%. The lamp does this at a frequency high enough that we (humans) can't see the flickering. Do you still think we would see the lamp as though it were at full intensity all the time?
 
  • #8
Nick89 said:
Imagine the lamp being off (black, no intensity) for about 99% of each cycle, and emitting a very short pulse of light (full intensity) for the remaining 1%. The lamp does this at a frequency high enough that we (humans) can't see the flickering. Do you still think we would see the lamp as though it were at full intensity all the time?
No, especially if there was any movement being observed. When I used to play table tennis, I could see a noticeable strobe effect on the ball under fluorescent lighting (the ball would alternate between a dull yellow and a bright white, back in the days of white balls). Motion blur would become animated, like watching a video shot with a very fast shutter speed at a normal frame rate. I see CRT monitor flicker at anything less than 80 Hz, although I'm not sure what the persistence of a typical CRT monitor is.
 
  • #9
The explanation that your brain "just kinda does the averaging" is tough to justify. After all, how do we know this?

That's a question I can't answer, except to say that the effect can be observed rather frequently. Think of it as watching a movie at the theater, or any video for that matter. You don't see the individual frames of the video; your brain blends them together into a uniform and continuous picture. I don't know why the brain does this, I just know it does.

Phrak said:
The light dimmer usually uses a triac, so the light-pulse frequency is 120 Hz (twice the 60 Hz line frequency). Your TV flickers much slower than that.

I believe he's referring to something that is powered by DC: not an incandescent bulb, but something more like an LED.
 
  • #10
Topher925 said:
I believe he's referring to something that is powered by DC: not an incandescent bulb, but something more like an LED.

An incandescent lamp driven at 60 Hz was the issue. The problem setup was a square-wave voltage. The OP asks about peak intensity perception vs. average intensity perception, apparently. According to the mythology I've been exposed to, the eye perceives peak intensity, not average. Which is it?
 
  • #11
Phrak said:
The OP asks about peak intensity perception vs. average intensity perception, apparently. According to the mythology I've been exposed to, the eye perceives peak intensity, not average. Which is it?

It's sort of both.

…as an increase in luminance, or luminous intensity due to pulsing, there is an increase in brightness because of the behavior of the eye. The eye does not behave as an integrating photometer, but as a partially integrating and partially peak-reading photometer. As a result, the eye perceives a brightness that is somewhere between the peak and the average brightness

http://www.digchip.com/datasheets/parts/datasheet/000/APP03.php
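As a toy illustration of that "partially integrating, partially peak-reading" description (the blend parameter alpha is invented for the example; the datasheet gives no such number):

```python
# Toy perceptual model blending time-averaged and peak intensity.
# alpha is an invented illustrative parameter; the datasheet above
# only says perception falls somewhere between the two extremes.

def perceived_brightness(peak: float, duty: float, alpha: float) -> float:
    """Brightness between pure averaging (alpha=0) and pure peak (alpha=1)."""
    average = peak * duty
    return alpha * peak + (1.0 - alpha) * average

# At 25% duty, a pure integrator reports 0.25 and a pure peak detector
# reports 1.0; a half-and-half blend lands in between.
print(perceived_brightness(peak=1.0, duty=0.25, alpha=0.5))  # 0.625
```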
 
  • #13
Jeff Reid said:
Assuming it's an incandescent bulb, the rate of heat dissipation is relatively slow compared to the rate of the PWM, and the variation in light output over time is small.

That's a good point that I failed to take into account. Even if the current in the filament gets cut off, it may not cool down right away by enough to stop glowing visibly.

Topher925 said:
That's a question I can't answer, except to say that the effect can be observed rather frequently. Think of it as watching a movie at the theater, or any video for that matter. You don't see the individual frames of the video; your brain blends them together into a uniform and continuous picture. I don't know why the brain does this, I just know it does.

You don't need to explain to me how cartoons/movies etc work. I'm aware of this effect. I already acknowledged it in post #4. But not being able to perceive very quick changes and blurring them together into something continuous is not the same thing as averaging them, and we were trying to establish that the latter was also true.

Topher925 said:
I believe he's referring to something that is powered by DC: not an incandescent bulb, but something more like an LED.

I didn't really specify it. I left it ambiguous and I apologize for that. When I first posted the problem, I had in my mind an incandescent bulb, because that's what we usually visualize when we think of a light dimmer. However, in my second post, I guess I unwittingly contradicted that somewhat because my descriptions were much more compatible with the DC scenario.

Nick89 said:
Imagine the lamp being off (black, no intensity) for about 99% of each cycle, and emitting a very short pulse of light (full intensity) for the remaining 1%. The lamp does this at a frequency high enough that we (humans) can't see the flickering. Do you still think we would see the lamp as though it were at full intensity all the time?

No. Intuitively, I don't. Clearly, then, *some* sort of averaging effect must be taking place. The biological explanation for it is probably beyond us (and perhaps not even well understood). But the approach that some people have taken (and that Topher brought to our attention in his last post) of likening the eye + brain to a photometer and attempting to determine what kind of photometer it is most analogous to... that seems like a good approach. Indeed, as Phrak put it, is it peak perception or average perception?

Phrak said:
An incandescent lamp driven at 60 Hz was the issue. The problem setup was a square-wave voltage. The OP asks about peak intensity perception vs. average intensity perception, apparently. According to the mythology I've been exposed to, the eye perceives peak intensity, not average. Which is it?

Again, good question.

Interesting answer, Topher, and thanks for the link!

Phrak said:
Thanks, Topher. Is there a paragraph you can quote?

Hmm... seeing as his post now has a quoted paragraph, I'm going to assume that it didn't originally, but that he then edited his post and quoted a paragraph as you requested.

Thank you all for your input. To summarize, I guess we have concluded that the dimming effect is largely a result of the way the eye+brain perceive light, but other effects that would tend to slow the response of the system to rapid changes in the input (i.e. heat dissipation or inductance) may also play a small role.

If the eye behaved as an integrating photometer, then I guess I can see why the brightness perceived (as averaged over some time period) would be lower for a PWM'd lamp with < 100% duty cycle than it would be for a lamp that's just on all the time. Topher's added wrinkle that the effect is somewhere between integrating and peak perception is intriguing, to say the least.
 
  • #14
cepheid said:
Hmm... seeing as his post now has a quoted paragraph, I'm going to assume that it didn't originally, but that he then edited his post and quoted a paragraph as you requested.

I think you're correct, and thanks for pointing that out! And thank you also, Topher. Too bad the graph doesn't show up.

cepheid said:
Thank you all for your input. To summarize, I guess we have concluded that the dimming effect is largely a result of the way the eye+brain perceive light, but other effects that would tend to slow the response of the system to rapid changes in the input (i.e. heat dissipation or inductance) may also play a small role.

The effect of inductance is way down in the mud; it's too small to have any perceptible effect.

But the dimming from heat dissipation is another matter. Heat will leave the filament; as this happens, its temperature falls roughly exponentially, and with it the peak frequency of its blackbody spectrum. The thermal mass of the thin tungsten filament is quite small, so the time constant is also small. With some experience observing a filament with this question in mind, I would put it in the same order of magnitude as 1/f, where f is the line frequency, 60 Hz.
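A sketch of that reddening effect using Wien's displacement law (the temperatures are assumed, and the exponential cooling with a ~1/60 s time constant is the estimate above):

```python
import math

# Wien's displacement law applied to the cooling estimate above. The
# temperatures are assumed; the 1/60 s time constant is the estimate.
WIEN_B = 2.898e-3   # Wien's displacement constant (meter-kelvin)
T_hot, T_amb, tau = 2700.0, 300.0, 1.0 / 60.0

def peak_wavelength_nm(T: float) -> float:
    """Wavelength (nm) of peak blackbody emission at temperature T (K)."""
    return WIEN_B / T * 1e9

for t_ms in (0, 5, 10, 20):
    T = T_amb + (T_hot - T_amb) * math.exp(-(t_ms / 1000.0) / tau)
    print(f"{t_ms:2d} ms: {T:6.0f} K, peak at {peak_wavelength_nm(T):5.0f} nm")
# The spectral peak shifts toward longer (redder) wavelengths as the
# filament cools, matching the "dimmer and redder" observation above.
```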
 
  • #15
cepheid said:
Hmm... seeing as his post now has a quoted paragraph, I'm going to assume that it didn't originally, but that he then edited his post and quoted a paragraph as you requested.

It's actually been there all along; I never edited the post. I guess PF was just flaking out for a moment or something.
 

1. How does a PWM light dimmer work?

A PWM (Pulse Width Modulation) light dimmer works by rapidly turning the light on and off at a specific frequency. The amount of time the light is on versus off determines the overall brightness of the light. By adjusting the length of the on and off periods, the dimmer can control the amount of power being sent to the light, thus dimming or brightening it.
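As a rough illustration (a software-PWM sketch under assumed values; set_lamp() is a hypothetical stand-in for real hardware I/O, not any particular dimmer's API):

```python
import time

# Minimal software-PWM sketch. set_lamp() is a hypothetical stand-in
# for real hardware I/O; the frequency and duty values are examples.

def set_lamp(on: bool) -> None:
    pass  # replace with a real GPIO or driver call

def pwm_dim(duty: float, freq_hz: float = 1000.0, cycles: int = 1000) -> None:
    """Drive the lamp at the given duty cycle (0.0 to 1.0)."""
    period = 1.0 / freq_hz
    for _ in range(cycles):
        set_lamp(True)
        time.sleep(duty * period)          # on for duty fraction of period
        set_lamp(False)
        time.sleep((1.0 - duty) * period)  # off for the remainder

pwm_dim(duty=0.25)  # lamp is on 25% of each period and appears dim
```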

2. What is the advantage of using a PWM light dimmer?

One of the main advantages of a PWM light dimmer is that it is much more energy-efficient than a traditional rheostat (resistive) dimmer. A series rheostat dims the lamp by dropping part of the supply voltage across a resistance, wasting the excess power as heat in the dimmer itself; a PWM switch is either fully on (negligible voltage drop) or fully off (no current), so very little power is wasted in the dimmer.
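A back-of-the-envelope comparison of the two approaches (all component values below are assumptions chosen for easy arithmetic):

```python
# Back-of-the-envelope loss comparison; all values are assumptions.
V = 12.0   # supply voltage (volts)
R = 6.0    # lamp resistance (ohms), taken as constant for simplicity

# Rheostat dimmer: a series resistance equal to R halves the current.
I = V / (R + R)            # 1.0 A
lamp_power = I**2 * R      # 6.0 W delivered to the lamp
rheostat_loss = I**2 * R   # 6.0 W wasted as heat in the dimmer
print(lamp_power, rheostat_loss)

# PWM dimmer with an ideal switch: to deliver the same 6 W on average,
# run at duty = 6 / (V**2 / R) = 0.25. The switch drops ~0 V when on
# and carries no current when off, so it wastes essentially nothing.
duty = lamp_power / (V**2 / R)
print(duty, "duty cycle; dimmer loss ~0 W")
```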

3. How does a PWM light dimmer differ from other types of dimmers?

A PWM light dimmer differs from other types of dimmers in that it uses digital signals to control the power supply, whereas traditional dimmers use analog signals. This allows for more precise and efficient control over the brightness levels, as well as the ability to use remote controls or smart home systems to adjust the dimmer.

4. Can a PWM light dimmer work with any type of light bulb?

In principle, yes, as long as the dimmer is compatible with the type of bulb being used: the dimmer must be rated for the bulb's voltage and wattage, and the bulb itself must tolerate rapid switching. Many CFL and LED lamps, in particular, only dim properly if they are specifically marked as dimmable.

5. Are there any potential drawbacks to using a PWM light dimmer?

One potential drawback of a PWM light dimmer is that it can cause some types of bulbs, such as incandescent or halogen bulbs, to emit a buzzing or humming noise as the filament vibrates in response to the rapid switching. This can usually be avoided by using dimmable LED or CFL bulbs instead.
