# How to convert AC Watts to DC Watts

1. May 21, 2012

### John1397

I do not work with this type of conversion, so I am not familiar with it. Here is my problem: I have an electrical supply line putting out DC at 0.6 amps and 17 volts, which equals 10.2 watts. Here is my question: if I take a 125 volt AC bulb rated at 10 watts, will this be enough power to make the bulb burn at full brightness, since both the DC supply output and the bulb are rated the same at about 10 watts?

John

2. May 21, 2012

### coolul007

The key to watts is that in a resistive load the conversion is one for one. However, it depends on the current the circuit draws. P = power in watts, I = current in amperes, E = voltage in volts, and R = resistance in ohms. The following formulas work for both DC and AC if the load is resistive rather than inductive or capacitive: P = I^2 R and E = IR. So 10 = 125I gives I = 0.08 amps, and the resistance is R = E/I = 125/0.08 = 1562.5 ohms. In a 17 volt circuit the current will be I = 17/1562.5 = 0.01088 amps, so the power consumed by the 10 watt bulb would be P = I^2 R = ((0.01088)^2)(1562.5) = 0.185 watts in the 17 volt circuit. The brightness of the bulb would not be the same.
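The arithmetic above can be sketched in a few lines of Python, treating the bulb as a fixed resistance (a simplification, since a real filament's resistance changes with temperature, as noted later in this thread):

```python
# A 10 W bulb rated for 125 V, modeled as a fixed resistor.
RATED_POWER = 10.0    # watts
RATED_VOLTS = 125.0   # volts
SUPPLY_VOLTS = 17.0   # volts

# From P = I*V at the rated operating point:
i_rated = RATED_POWER / RATED_VOLTS          # 0.08 A
r_bulb = RATED_VOLTS / i_rated               # 1562.5 ohms

# Same resistance on the 17 V supply:
i_supply = SUPPLY_VOLTS / r_bulb             # ~0.01088 A
p_supply = i_supply ** 2 * r_bulb            # ~0.185 W

print(f"R = {r_bulb} ohms, I = {i_supply:.5f} A, P = {p_supply:.3f} W")
```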

3. May 22, 2012

### vk6kro

Lamps are not all equally efficient, but 10 watts is the same wherever it comes from. So, the light output for a given power input will not always be the same with different lamp types.

Notice, though, that if a 125 volt lamp is run on 17 volts, the power used will be a lot less than if it was run on 125 volts. This is because the lamp will draw a much smaller current on 17 volts than it would on 125 volts.

The actual current is a little hard to predict since the lamp filament will run cooler on 17 volts and so it will have a higher resistance than it does when it runs hot with 125 volts on it.

Power is just the product of voltage and current (for resistive loads like lamps) and it doesn't matter if the voltage is AC or DC.
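A rough numeric sketch of the temperature effect mentioned above (the factor-of-ten cold-to-hot resistance ratio is a commonly quoted ballpark for tungsten filaments, assumed here rather than taken from the thread):

```python
# Bracket the 17 V current between a "hot filament" and a
# "cold filament" estimate for a 125 V / 10 W bulb.
RATED_POWER = 10.0
RATED_VOLTS = 125.0
SUPPLY_VOLTS = 17.0

r_hot = RATED_VOLTS ** 2 / RATED_POWER    # 1562.5 ohms at full brightness
r_cold = r_hot / 10                       # assumed cold value (~156 ohms)

# On 17 V the filament runs cooler than rated, so its resistance is
# somewhere between r_cold and r_hot, and the true current lies
# between these two bounds:
i_if_hot = SUPPLY_VOLTS / r_hot           # constant-R lower bound
i_if_cold = SUPPLY_VOLTS / r_cold         # cold-filament upper bound

print(f"current on 17 V is between {i_if_hot:.4f} A and {i_if_cold:.4f} A")
```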

4. May 22, 2012

### John1397

I do not know if that answers my question, but to put it another way: suppose you have two bulbs:

125 volt .0800 amp = 10 watt
12 volt .8333 amp = 10 watt

Both bulbs are rated at 10 watts; one has higher voltage and the other higher current, but both should give the same amount of light. And if you had to buy electricity from the power company, you would pay the same amount for both bulbs, would you not? The same would go for two 10 watt heaters with different voltages: they would both give the same amount of heat, would they not?
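The comparison above checks out numerically: at rated voltage both bulbs draw the same power, so the same running time costs the same (the tariff below is just an assumed example rate, not from the thread):

```python
# Two bulbs with the same 10 W rating but different V/I splits.
bulbs = {
    "125 V bulb": (125.0, 0.0800),   # volts, amps
    "12 V bulb":  (12.0, 0.8333),
}
RATE_PER_KWH = 0.12   # assumed tariff, dollars per kWh
HOURS = 100.0

for name, (volts, amps) in bulbs.items():
    watts = volts * amps
    cost = watts / 1000 * HOURS * RATE_PER_KWH
    print(f"{name}: {watts:.2f} W, {HOURS:.0f} h costs ${cost:.3f}")
```

Both come out at 10 W and the same cost, which is the point of the question: the power company bills for energy (power times time), not for current.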

John

Last edited: May 22, 2012
5. May 22, 2012

### psparky

You do not buy current from the power company....you buy energy: power (P = IV) multiplied by time, billed in watt-hours.

If I understand your question correctly........hooking a 125 volt, 10 watt light to a 12 volt source will not give you .8333 amps at 10 watts.

The power is based off the resistance of the original light.......V=IR.

Once you find the resistance of your original light, you must use V = IR again when hooking it up to the 12 volt source. Do the math and you will see very low current and wattage from your light on the 12 volt source.

The power rating (P = IV) does not carry over to a new voltage; the fixed resistance in V = IR is what governs.

Two 10 watt heaters will give the same heat....but again.....V = IR must be worked out first to get your watts right.

To make sure you've got it....say I take your 125 volt light and hook it to three different sources: 240 V, 120 V, and 12 V.
All three voltages see the same resistance. Each one will drive its own current according to V = IR. Therefore, each one will put out different watts according to P = IV once the current has been determined.
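The three-source example above, worked through with the 125 V / 10 W bulb's resistance found first and then held fixed (ignoring filament temperature effects, as the posts above do):

```python
# Same bulb, three different supply voltages.
r = 125.0 ** 2 / 10.0          # 1562.5 ohms, from the rated point

results = {}
for volts in (240.0, 120.0, 12.0):
    i = volts / r              # V = IR gives the current first...
    p = i * volts              # ...then P = IV gives the watts
    results[volts] = p
    print(f"{volts:5.0f} V -> {i:.5f} A, {p:.3f} W")
```

Note that the 240 V case comes out well above the 10 W rating, so a real bulb would likely burn out there.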

Last edited: May 22, 2012
6. May 22, 2012

### FOIWATER

I agree with the last post; a simpler way to state it (not to take away from it) is this:

A 60 watt lamp is only 60 watts at its rated voltage. The voltage and the resistance determine the current, not the wattage.

The voltage applied, and ultimately the current drawn, yields the wattage.

A 60 watt, 120 volt lamp is only 60 watts at 120 volts; apply less than that and you will draw less current and get less wattage.

7. May 22, 2012

### psparky

One more thing for John.......P does equal IV in both AC and DC....just make sure you use RMS values when doing your calculations in AC....not the peak value of your sine wave.

Simply divide the peak value of your sine wave by the square root of 2 to find the RMS value.

The calculation is more complicated for non-sinusoidal inputs, but for now this should suffice.

Also, when talking about AC phasors (vectors).....these vectors must always be expressed in RMS......otherwise they are not correct, because a rotating vector must maintain a constant magnitude, which is the case with RMS (root mean square).
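The peak-to-RMS conversion above is a one-liner; shown here with 170 V peak, which is approximately the peak of a nominal 120 V RMS US line supply:

```python
import math

v_peak = 170.0                    # peak volts of the sine wave
v_rms = v_peak / math.sqrt(2)     # RMS = peak / sqrt(2) for a sine
print(f"{v_peak} V peak -> {v_rms:.1f} V RMS")
```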

8. May 22, 2012

### Staff: Mentor

If you want your 125VAC bulb to glow exactly as brightly on DC, then you must supply it with 125VDC.

I assume we are talking about ordinary (i.e., incandescent) bulbs.

9. May 22, 2012

### DragonPetter

If you know the AC peak-to-peak voltage and current across a resistive load R, the equivalent (RMS) power is:

$P = V_{rms} I_{rms} = \frac{V_{pp}}{2\sqrt{2}} \cdot \frac{I_{pp}}{2\sqrt{2}} = \frac{V_{pp}^{2}}{8R}$
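A quick numeric check of the expression above: for a sine wave across a resistor, the RMS voltage is the peak-to-peak value divided by 2√2, so the power computed from RMS should match the peak-to-peak formula (the 20 V / 100 Ω values below are arbitrary example numbers):

```python
import math

v_pp = 20.0          # peak-to-peak volts (example value)
r = 100.0            # ohms (example value)

v_rms = v_pp / (2 * math.sqrt(2))     # RMS from peak-to-peak
p_from_rms = v_rms ** 2 / r           # P = V_rms^2 / R
p_from_formula = v_pp ** 2 / (8 * r)  # P = V_pp^2 / (8 R)

print(p_from_rms, p_from_formula)     # both 0.5 W
```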

10. May 24, 2012

### sophiecentaur

A Watt is a Watt is a Watt. To work out the mean power when the voltage is varying, you have to add up the I times V products for every instant of time (i.e. you need to integrate IV over time). If your AC waveform is sinusoidal, then the 'root two' factor comes into the answer; but if the AC waveform is not a sinusoid (as with some cheapo inverters), there is no simple conversion factor. For a pure square wave (swinging +/- about zero), as a matter of fact, the RMS (equivalent DC) voltage is exactly the same as the peak voltage.
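The averaging described above can be done numerically: RMS is the square root of the mean of v(t)² over one period. For a sine the result is peak/√2; for a ± square wave it equals the peak exactly, as the post says:

```python
import math

N = 100_000     # samples over one period
peak = 10.0     # peak volts

def rms(samples):
    """Root of the mean of the squared samples."""
    return math.sqrt(sum(v * v for v in samples) / len(samples))

sine = [peak * math.sin(2 * math.pi * k / N) for k in range(N)]
square = [peak if k < N // 2 else -peak for k in range(N)]

print(f"sine RMS:   {rms(sine):.4f}  (peak/sqrt(2) = {peak/math.sqrt(2):.4f})")
print(f"square RMS: {rms(square):.4f}  (equals peak)")
```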