# Understanding the effect of duty cycle on the brightness of LEDs

## Summary:

If I have understood correctly, the duty cycle of a pulse signal can be used to control the brightness of an LED. I am unsure why this happens.
The brightness of an LED is a result of the current passing through it, and the current is a function of the voltage across the LED.

The duty cycle represents the percentage of the time period for which the signal is in the ON state (meaning the voltage is at its maximum value, compared to 0 V for the rest of the period).

If I have understood correctly, it seems that the duty cycle can be increased to make the LED brighter?
I don't see why this should happen, given that at the end of the day the voltage across the LED is constant irrespective of the duty cycle.

The explanation seems to be that the effective voltage is what causes the LED to change brightness (effective/average voltage = duty cycle × maximum voltage). However, I am unable to see why this should be so, given that in reality it is always the maximum voltage that is being applied.

Averagesupernova
Gold Member
Why do you think max voltage is always applied? Duty cycle implies there is a percentage of time there is NO voltage across the LED. BTW, the current through and the voltage across an LED do NOT have a linear relationship, as you seem to imply.

Borek
Mentor
Two ways of dimming the LED. One is to lower the voltage; then the current goes down and the LED gets dimmer (this is in no way related to the duty cycle, and as @Averagesupernova mentioned, it won't be linear). The second is to always use the same voltage (the current will follow) but to flicker the LED on/off (the duty cycle, typically implemented with PWM as far as I am aware) so fast that you can't see it. The net effect is that the light gets dimmer.

DaveE
Gold Member
First, current as a function of applied voltage is not a very stable function in diodes (because of temperature). Also, the light emitted is really related to the charge carriers that cross the junction (i.e. the current) more than to the voltage. So, it's not really your question, but it is much better, IMO, to think of the current through the diode rather than the voltage across it. In fact, this is how good circuit designers will power an LED. In practice, it is incredibly common for the LED manufacturer to include a series resistor in the package to help do this for you. If you ever see an LED that is advertised with a voltage like 5 V, 12 V, etc., then you can be sure that they have done this.

The "brightness" you speak of is integrated by the response of your visual system; i.e. total photons collected over some time period (maybe 20msec, or so). The PWM method of controlling brightness relies on this, it will operate at frequencies high enough that you don't notice the on/off flicker. So, you are correct, the instantaneous brightness of the LED is the same when its on. But when averaged with the on/off duty cycle, it will appear to change in brightness.

The PWM approach is easier to implement because it only involves switching, instead of analog control of the LED current. Varying the LED voltage is an even worse method of brightness control, since the I-V relationship is very non-linear and varies with temperature.
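As a rough illustration of that averaging (the 1 kHz PWM frequency and 20 ms "eye integration" window below are assumed values for the sketch, not anything specific from this thread), the perceived brightness simply tracks the fraction of the window the LED is on:

```python
# Sketch: average a rectangular PWM waveform over an "eye integration"
# window. Assumed values: 1 kHz PWM, 20 ms window.
def perceived_brightness(duty, pwm_freq_hz=1000, window_s=0.020, steps=20000):
    dt = window_s / steps
    on_time = 0.0
    for i in range(steps):
        t = i * dt
        phase = (t * pwm_freq_hz) % 1.0   # position within the PWM period, 0..1
        if phase < duty:                   # LED fully on for this fraction
            on_time += dt
    return on_time / window_s              # fraction of the window the LED was on

for duty in (0.1, 0.5, 0.9):
    print(duty, round(perceived_brightness(duty), 3))
```

The instantaneous output is always either full brightness or zero; only the time-averaged value, which is what the eye responds to, changes with the duty cycle.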

JC2000
> Why do you think max voltage is always applied? Duty cycle implies there is a percentage of time there is NO voltage across the LED.
I think only the max voltage can be applied, since the voltage source can only produce that voltage; the rest of the time there should be no voltage. The only way I can come to terms with an average voltage being produced is to think in terms of power, wherein the power is the same whether you consider the max voltage for a fraction of the time period or the average for the entire time period (?).

Just realised that a diode won't have linear characteristics!

Averagesupernova
Gold Member
Average voltage is peak * duty cycle. Forget power for this.
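A quick numeric sketch of that formula (the 5 V peak below is a made-up example value):

```python
# Average (DC) voltage of a rectangular PWM waveform: peak * duty cycle.
peak_v = 5.0  # assumed example supply voltage

def average_voltage(duty):
    return peak_v * duty

for duty in (0.25, 0.50, 0.75):
    print(f"duty {duty:.0%}: average = {average_voltage(duty):.2f} V")
```

So a 5 V signal at 50% duty cycle has an average of 2.5 V, even though the instantaneous voltage is only ever 5 V or 0 V.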

JC2000
> Two ways of dimming the LED. One is to lower the voltage; then the current goes down and the LED gets dimmer (this is in no way related to the duty cycle, and as @Averagesupernova mentioned, it won't be linear). The second is to always use the same voltage (the current will follow) but to flicker the LED on/off (the duty cycle, typically implemented with PWM as far as I am aware) so fast that you can't see it. The net effect is that the light gets dimmer.
Yes, I remember now that the voltage-current relationship is not linear for diodes!

Regarding the second method of dimming the LED ...
Why does flickering affect the brightness?

> Average voltage is peak * duty cycle. Forget power for this.
I know that if the LED flickers fast enough then it seems like it's on all the time. I am not able to use this information to explain why the brightness seems to change due to this.

Averagesupernova
Gold Member
> I know that if the LED flickers fast enough then it seems like it's on all the time. I am not able to use this information to explain why the brightness seems to change due to this.
I cannot see why. Average current is usually what we are worried about with LEDs, as @DaveE has alluded to. But if you can understand one, why not the other? When you alternately dump cold and hot water into a bucket, can you see how the average forms a warm temperature? I know people on here hate analogies, flame away. Lol.

The "brightness" you speak of is integrated by the response of your visual system; i.e. total photons collected over some time period (maybe 20msec, or so). The PWM method of controlling brightness relies on this, it will operate at frequencies high enough that you don't notice the on/off flicker. So, you are correct, the instantaneous brightness of the LED is the same when its on. But when averaged with the on/off duty cycle, it will appear to change in brightness.
Since intensity is the number of photons per unit area per unit time:
Would it be correct to conclude that in reality the voltage applied across the LED is always the max voltage, but the "dimness" is due to the fact that a smaller duty cycle means fewer photons emitted over the same time period, and hence lower intensity?

> I cannot see why. Average current is usually what we are worried about with LEDs, as @DaveE has alluded to. But if you can understand one, why not the other? When you alternately dump cold and hot water into a bucket, can you see how the average forms a warm temperature? I know people on here hate analogies, flame away. Lol.
I think I get it now. I needed to revisit the definition of intensity. My understanding now is that fewer photons are emitted for a smaller duty cycle, hence the dimness (?)

Averagesupernova
Gold Member
I think the actual intensity as you see it is more about perception.
-
The same way we see motion pictures as smooth rather than the choppy still images they actually are. It's about perception.

Averagesupernova
Gold Member
The off period of the cycle results in an LED that is actually emitting NO photons. A phototransistor hooked to a scope will show it. I've built PWM circuits that carry audio over a light beam. I've seen this.

JC2000
> I think the actual intensity as you see it is more about perception.
But for the same time period, if one duty cycle (the dimmer one) is lower than another, then doesn't the dimmer one actually emit fewer photons in that time? I get that the flickering is seen as a continuous glow due to persistence of vision.

Averagesupernova
Gold Member
> But for the same time period, if one duty cycle (the dimmer one) is lower than another, then doesn't the dimmer one actually emit fewer photons in that time? I get that the flickering is seen as a continuous glow due to persistence of vision.
Yes, over the period of the signal there are fewer photons.
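For a concrete (made-up) number: if the LED emits photons at some fixed rate while on, the photon count per PWM period scales directly with the duty cycle. Both the rate and the period below are assumed values for illustration only:

```python
# Rough sketch: photons emitted per PWM period scale with duty cycle.
photon_rate = 1e15   # photons per second while the LED is on (assumed)
period_s = 0.001     # 1 ms PWM period (assumed)

def photons_per_period(duty):
    # The LED emits only during the on-fraction of the period.
    return photon_rate * period_s * duty

full = photons_per_period(1.0)
dim = photons_per_period(0.25)
print(dim / full)   # 0.25: a 25% duty cycle emits a quarter of the photons
```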

Borek
Mentor
> Why does flickering affect the brightness?
Because the eye integrates and averages over time.

DaveE
Gold Member
> Since intensity is the number of photons per unit area per unit time:
> Would it be correct to conclude that in reality the voltage applied across the LED is always the max voltage, but the "dimness" is due to the fact that a smaller duty cycle means fewer photons emitted over the same time period, and hence lower intensity?
Yes, exactly.

russ_watters
Mentor
> The same way we see motion pictures as smooth rather than the choppy still images they actually are. It's about perception.
Er, also dimmer than they actually are --- this example is exactly the issue raised in the OP.

[for older style projectors]
