# Resisting 12volts

DJM111188
I am trying to get a light in a gauge in my car to become dimmer. How do I go about doing so? I want to keep the voltage at 12 volts but I want to lower the amperage by half. Would I have to buy a 2 ohm, 72 watt resistor in order to do this?

Also, I am trying to find an inexpensive learning kit so I can learn more about how electronics work. Any suggestions on a good site, kit, or perhaps even a good book?

## Answers and Replies

You would need to know how much current the lamp would use with 6 volts across it.

If you can arrange 6 volts, just measure the current into the lamp.
It is unlikely to be 6 amps. Possibly 0.3 amps or so.

Be sure that this current is not also powering the gauge sensor.

Once you know this current, just use Ohm's Law to work out the resistance needed to drop 6 volts at that current.

For example, if the current was 0.3 amps, the resistor would be (6 volts / 0.3 amps) or 20 ohms.
A 22 ohm resistor rated at better than 1.6 watts would be needed. (6V * 6V / 22 ohms = 1.6 watts)
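The arithmetic above can be sketched numerically (using the example figure of 0.3 amps; your lamp's actual current will differ):

```python
# Sizing a series dropping resistor for the lamp, using the example
# figures above (lamp draws 0.3 A at 6 V; values are illustrative).
supply_v = 12.0   # vehicle supply
lamp_v = 6.0      # target voltage across the lamp
lamp_i = 0.3      # measured lamp current at 6 V

drop_v = supply_v - lamp_v             # voltage the resistor must drop: 6 V
r_exact = drop_v / lamp_i              # Ohm's law: 6 / 0.3 = 20 ohms
r_standard = 22.0                      # nearest common stocked value
p_resistor = drop_v ** 2 / r_standard  # 36 / 22 ~ 1.6 W, so buy a part rated higher
print(r_exact, round(p_resistor, 2))
```

The resistor's power rating should comfortably exceed the calculated dissipation, so a 2 W or 5 W part would be a sensible choice here.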

If you can't arrange 6 volts, measure the cold resistance of the lamp and work from that, although the lamp resistance will increase when it gets hot.

Phrak
The resistance of a tungsten filament increases substantially on heating (to about 3000 K); to roughly 19 times its cold resistance.

I'm guessing your light is a common incandescent light. If so, and to keep it simple, place another bulb of the same wattage in series to divide the current by two.

I took a typical 12 volt dial lamp bulb and measured the current at different voltages.

Here are the results:

| Voltage | Current | Resistance |
|---------|---------|------------|
| 12 V    | 280 mA  | 42.8 ohms  |
| 10 V    | 250 mA  | 40 ohms    |
| 8 V     | 230 mA  | 34.7 ohms  |
| 6 V     | 190 mA  | 31.6 ohms  |
| 4 V     | 160 mA  | 25 ohms    |
| cold    | n/a     | 19 ohms    |

So, that is a ratio of 2.2 to 1 from hot to cold.
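Those measurements can be turned into resistances with a quick R = V / I sketch:

```python
# Filament resistance at each measured point, via R = V / I.
measurements = [(12, 0.280), (10, 0.250), (8, 0.230), (6, 0.190), (4, 0.160)]
resistances = [v / i for v, i in measurements]  # ~ [42.9, 40.0, 34.8, 31.6, 25.0]

cold_r = 19.0                             # measured cold resistance
hot_cold_ratio = resistances[0] / cold_r  # ~ 2.25, the "2.2 to 1" quoted above
print(round(hot_cold_ratio, 2))
```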

Phrak
That's interesting vk6kro. By the way, I posted without having had your post available, as I was researching. One number of interest seems to be the ratio 42.8 to 31.6.

What I did was search the internet for the temperature of a working tungsten filament (2800 to 3300 K). I then found a reference for the resistivity of tungsten @ temperature: 5.5 uOhm-cm at 20C vs 105 uOhm-cm at 3000C.

Hmm. I should have interpolated to 90 uOhm-cm, which would put the ratio at 16 to 1. I don't know why our numbers are in so much disagreement.

It strikes me that filaments are probably an alloy rather than pure tungsten. Sure enough, a short search implies tungsten alloys are in use. A fruitful search on the resistivity of a typical alloy doesn't seem likely to succeed.

Last edited:
DJM111188
I do appreciate your insights, though this is going a little over my head. I hadn't realized that a bulb's resistance would change with temperature. Also, I don't know whether or not it's important to note, but the light is a series of LEDs. The particular gauge I have is extremely bright, so I assumed that I would be able to just solder in a resistor to dim the gauge to a usable range. That being said, I own many gauges, and have ones with regular incandescent bulbs.

Now vk6kro, what do you mean when you say "if I can arrange 6 volts"? Does that mean that I should use a 6 volt power source just for testing purposes? The reason I wanted to stick with the 12 volts is because the gauge was designed to run on a standard 12 volt system.

When I asked about the 2 ohm, 72 watt resistor, it was on the basis of just 12 volts with no load. I didn't take into account the draw of the LED. So let me see if I have this straight... In order to determine the correct resistor for the desired current, I need to measure the resistance of the load (in this case an LED or light bulb) and then what?

I grabbed this formula off a car audio website. It describes how to determine the proper resistor for an LED on a 12 volt system.

Working voltage (Vf)=1.8 volts
Desired current flow=15ma (.015 amps)
Power supply voltage=12 volts

12-1.8=10.2
10.2/.015=680 ohms
.015*10.2=.153 watts

So I assume working voltage means the amount of volts it uses, right? So then 12 volts minus 1.8 volts leaves 10.2 volts. Since it draws a current of .015 amps, I need to find a resistor that will drop the 10.2 volts at that current. Which means I need a 680 ohm resistor that can handle at least .153 watts. Is this accurate?
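The car-audio example above, worked step by step (the 1.8 V forward voltage and 15 mA are that example's figures, not measured values for this particular gauge):

```python
# Series resistor for a single LED on a 12 V supply, using the
# car-audio site's example numbers.
supply_v = 12.0
vf = 1.8         # LED forward ("working") voltage from the example
i_led = 0.015    # desired current: 15 mA

drop_v = supply_v - vf   # 10.2 V left for the resistor to drop
r = drop_v / i_led       # 10.2 / 0.015 = 680 ohms
p = drop_v * i_led       # 0.153 W dissipated, so a 1/4 W resistor is fine
print(round(r), round(p, 3))
```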

DJM111188
Also, can you please tell me what will happen if I use a headlight bulb rated at 12 V / 55 watts and apply it to a 24 volt circuit with only 2.29 amps? If it's a 12 volt bulb, I would imagine that it cannot go any higher than that, but seeing as I don't know much about electrical properties, perhaps I am wrong?

I really appreciate the info you guys are providing me with. I hope I'm not being a bother.

Yes, LEDs do make a difference. If you have LEDs, the voltage across them depends on the colour of the LED. White LEDs need about 3 to 3.5 volts.
You can read about it here if you like:
http://en.wikipedia.org/wiki/LED

Also, if the LEDs are in series, this matters, too.

Your formula is right, except that you have to allow for the colour of the LEDs and how many are in series.

You will find a resistor in series with your LEDs. You could try measuring this resistor and then buying a resistor which has twice as much resistance as the one that is in there now.
Or take it to a store and get someone to measure it for you if you don't have a meter.
They only cost a few cents.

The 6 volt supply was a test setup that you would use to measure how much current a bulb would draw at 6 volts. Because the resistance is a bit unpredictable, this is a reliable way of doing it.

If you put 24 volts across a 12 volt lamp it would glow very brightly for a second or two then go dark forever.
If your supply could only deliver 2.29 amps, you would not be able to get 24 volts across that lamp. In fact, you would not be able to put more than 6 volts across the lamp.

Ignoring lamp non-linearity, the lamp would have a resistance of about 2.6 ohms (12 V * 12 V / 55 W).
At 6 volts this would draw 2.29 amps.
So, any more than 6 volts would cause the fuse to blow or whatever happens when that supply is overloaded.
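The reasoning above can be checked numerically (ignoring the filament's change in resistance, as stated):

```python
# A 12 V / 55 W headlight bulb on a supply limited to 2.29 A.
lamp_r = 12.0 ** 2 / 55.0      # hot resistance: R = V^2 / P ~ 2.62 ohms
i_limit = 2.29                 # the supply's maximum current

# With the current capped, Ohm's law gives the most voltage the supply
# can actually develop across the lamp:
v_at_limit = i_limit * lamp_r  # ~ 6 V, not 24 V
print(round(lamp_r, 2), round(v_at_limit, 1))
```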

DJM111188
I'm not sure I follow you when you say I wouldn't be able to put more than 6 volts across a 12 volt rated lamp. I used the 2.29 amp value because at 24 volts that would equal 55 watts. So I was seeing whether or not 55 watts is universal. Obviously from what you are saying, it is not... but why is that?

Also, I do have a multimeter, quite an expensive one at that, so now I should be able to find some use for it aside from measuring injector ohms. Again, thanks for the input.

StkMtd
> So I assume working voltage means the amount of volts it uses, right? ... Which means I need a 680 ohm resistor that can handle at least .153 watts. Is this accurate?

RESISTOR_VALUE = (SUPPLY_VOLTAGE - (V_f * NUM_LEDS)) / LED_CURRENT

The thing with LEDs is they have very little internal resistance (unlike a tungsten filament). If you hook one up without a resistor, small changes in voltage will create much larger changes in current, and this is what kills your LED (in an unsatisfying plume of black smoke).

There are some things you should find out:

1. Are they hooked up in series? parallel? parallel series?
2. What is the V_f rating, and recommended current level?

The equation I stuck up there works for a chain of serial LEDs (with the resistor at the supply). It basically says: Given V = IR, R = V/I, V = Total voltage available minus the number of LEDs times their voltage (which leaves you with the voltage the resistor needs to dissipate), divided by the current that the resistor should pass. Very handy formula.
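The series-chain formula above can be sketched as a small function (the three-LED example numbers below are hypothetical, purely for illustration):

```python
def led_series_resistor(supply_v, vf, num_leds, i_led):
    """Resistor for a chain of num_leds series LEDs, resistor at the supply."""
    drop = supply_v - vf * num_leds  # voltage left for the resistor to drop
    if drop <= 0:
        raise ValueError("supply too low for this many LEDs in series")
    return drop / i_led

# Example: three 1.8 V LEDs at 15 mA from 12 V -> (12 - 5.4) / 0.015 = 440 ohms
r = led_series_resistor(12.0, 1.8, 3, 0.015)
print(round(r))
```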

In my experience, the current supplied to an LED is not in linear proportion to perceived brightness. It might not be as easy as doubling the resistor value. You might have to play with values till you get the brightness you want.

My rule of thumb is to calculate the needed resistor for meeting the minimum current requirement, and the maximum current requirement, then I generally pick the closest value I have that is greater than the minimum, but always less than the maximum. Most manufacturers will specify a range of current ratings for LEDs.
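One way to sketch that rule of thumb (the E12 stock list and the 10 to 20 mA range below are hypothetical; check the actual LED datasheet):

```python
# Pick a stocked resistor that keeps the LED current between a
# datasheet minimum and maximum.
E12_STOCK = [330, 390, 470, 560, 680, 820, 1000]

def pick_resistor(drop_v, i_min, i_max, stock=E12_STOCK):
    r_floor = drop_v / i_max    # any smaller resistor exceeds the max current
    r_ceiling = drop_v / i_min  # any larger resistor drops below the min current
    candidates = [r for r in stock if r_floor <= r <= r_ceiling]
    # Smallest valid resistor gives the highest safe current (brightest).
    return min(candidates) if candidates else None

# 10.2 V across the resistor, LED rated 10-20 mA -> valid window 510..1020 ohms
r = pick_resistor(10.2, 0.010, 0.020)
print(r)
```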

If you like programming at all, you could look into an Arduino. It's a fun little hobby protoboard with a processor built in which will let you do all kinds of things with 5V logic (which is easily expanded using transistors and relays). I got one, learned it for a month, and quickly moved on to programming AVR microcontrollers without the arduino platform. It's a bit of an addictive hobby.

Last edited:
DJM111188
> The thing with LEDs is they have very little internal resistance (unlike a tungsten filament). ... You might have to play with values till you get the brightness you want.

Hey man I really appreciate your input. This is very easy for me to understand. Also I just checked out the Arduino web page and I'm definitely going to give them a shot. Again thanks for the insight.

Gold Member
Did you ever consider just using a lower power 12V bulb? They exist in a range of powers - just like domestic light bulbs.

DJM111188
The LED's are soldered onto a pcb board(I'm not sure if that's the proper terminology). Which are inside the gauge. Only way I can easily lower the intensity is to wire up a resistor.

StkMtd
> The LED's are soldered onto a pcb board(I'm not sure if that's the proper terminology). Which are inside the gauge.

no need to add board to pcb. PCB = Printed Circuit Board.

Do you have a soldering iron and a multimeter to do this work? If you had some pictures of the board, that might help as well (so we can see what's going on)

Also, if you could get the layout of the LEDs and resistor, and a voltage reading on one of the LEDs (while powered by 12 V of course), then we could easily find the voltage drop on the resistor and the necessary current for the LEDs. Totally doable in an afternoon if you have the tools and the right resistor. It might be best if you had a potentiometer (variable resistor) so you can twist a knob to work out the brightness you want. You may even be able to install a knob permanently to get a brightness fader on the LEDs (which would definitely impress your friends :P).

DJM111188
I do have a multimeter and soldering iron. Originally I was going to just use an old dimming switch from a junk car and wire it solely to this gauge but I decided to go the route of wiring in a resistor for various reasons. I now know what I need to do though and I thank everyone for their input.