Hi all, I'm working on an automotive LED lighting project in which the supply voltage comes from the battery/alternator and so varies from 12 to 14V. I'm trying to find a way to drop the LED circuit supply voltage that would be better than using resistors. Here's the current design:

The problem I see with the above circuit is R1 must dissipate

P = I*I*R ≈ 0.120A * 0.120A * 100Ω ≈ 1.44 W

which is rather large considering I would prefer to use 1/4 or 1/8 W rated resistors.
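For anyone checking the numbers, here's a quick sketch of the dissipation math, assuming 120 mA total through R1 (e.g. six parallel 20 mA LED strings; the string count is my assumption, not stated above):

```python
# Power dissipated in a series dropping resistor.
# Assumes 120 mA total LED current through R1, as in the calculation above.
I = 0.120            # total current through R1, in amps (assumed: 6 x 20 mA strings)
R1 = 100.0           # R1 in ohms
V_drop = I * R1      # voltage dropped across R1
P = I * I * R1       # power dissipated in R1, in watts (P = I^2 * R)
print(f"V_drop = {V_drop:.1f} V, P = {P:.2f} W")  # prints: V_drop = 12.0 V, P = 1.44 W
```

Equivalently, P = I × V_drop = 0.120 A × 12 V ≈ 1.44 W, which is why a 1/4 W or 1/8 W part won't survive here.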

So that got me thinking in a more general sense about the efficiency of voltage-dividing circuits that use resistors. There must be some other circuit I can use that would be more efficient and not that much more complex. Something like a transformer, but for DC. I read a bit about linear regulators and SMPS, but I'm not quite sure how to use those. Any tips or ideas?

I'm a newb with electronics, so thanks in advance for any help!

*R1 = 100Ω*

R2 = 3.9kΩ

Vsupply = 14V

LEDs: 20mA @ 2V

