How to convert AC Watts to DC Watts


by John1397
Tags: convert, watts
John1397
#1
May21-12, 09:43 PM
I do not work with this type of conversion, so I am not familiar with it. Here is my problem: I have an electrical supply line putting out DC at 0.6 amps and 17 volts, which equals 10.2 watts. My question: if I take a 125 volt AC bulb rated 10 watts, will this be enough power to burn the bulb at full brightness, since both the DC supply output and the bulb are rated at the same 10 watts?

John
coolul007
#2
May21-12, 10:20 PM
Quote by John1397:
I do not work with this type of conversion so I am not familiar on how to understand so here is my problem I have a electrical supply line putting out DC current at .6 amps by 17 volts which equals 10.2 watts here is my question if I take a 125 volt AC bulb rated 10 watts will this be enough power to burn the bulb as bright as both the output watts on DC supply and bulb are rated the same at 10 watts?

John
The key to watts is that for a resistive load the conversion is one-for-one. However, it depends on the current the circuit draws. P = power in watts, I = current in amperes, E = voltage in volts, and R = resistance in ohms. The following formulas work for both DC and AC as long as the load is resistive, not inductive or capacitive: P = I^2 R and E = IR. From P = IE, 10 = 125I, so I = 0.08 amps, and the resistance is R = E/I = 125/0.08 = 1562.5 ohms. In a 17 volt circuit the current will be I = 17/1562.5 = 0.01088 amps, so the power consumed by the 10 watt bulb would be P = I^2 R = ((0.01088)^2)(1562.5) = 0.18496 watts in the 17 volt circuit. The brightness of the bulb would not be the same.
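The arithmetic above can be sketched as a short script (all numbers come from the post; the constant filament resistance is the idealization the post uses):

```python
# A 10 W, 125 V bulb run from a 17 V supply, assuming the filament
# resistance stays constant (in reality it drops when the filament
# runs cooler, as noted later in this thread).
rated_power = 10.0     # watts
rated_voltage = 125.0  # volts

rated_current = rated_power / rated_voltage  # I = P / E = 0.08 A
resistance = rated_voltage / rated_current   # R = E / I = 1562.5 ohms

supply_voltage = 17.0
current = supply_voltage / resistance        # about 0.01088 A
power = current**2 * resistance              # about 0.185 W

print(f"R = {resistance} ohms, I = {current:.5f} A, P = {power:.3f} W")
```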
vk6kro
#3
May22-12, 03:47 AM
Sci Advisor
Lamps are not all equally efficient, but 10 watts is the same wherever it comes from. So, the light output for a given power input will not always be the same with different lamp types.

Notice, though, that if a 125 volt lamp is run on 17 volts, the power used will be a lot less than if it was run on 125 volts. This is because the lamp will draw a much smaller current on 17 volts than it would on 125 volts.

The actual current is a little hard to predict since the lamp filament will run cooler on 17 volts and so it will have a higher resistance than it does when it runs hot with 125 volts on it.

Power is just the product of voltage and current (for resistive loads like lamps) and it doesn't matter if the voltage is AC or DC.

John1397
#4
May22-12, 06:35 AM

I do not know if that answers my question, but put another way if you have two bulbs:

125 volt .0800 amp = 10 watt
12 volt .8333 amp = 10 watt

both of the bulbs are rated 10 watts; one has higher volts and the other higher amps, but both bulbs should give the same amount of light, and if you had to buy power from the power company you would pay the same amount for both bulbs, would you not? This would work the same with two 10 watt heaters of different voltages: they would both give the same amount of heat, would they not?

John
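John's comparison can be checked with a quick script, assuming each bulb is an ideal resistor run at its own rated voltage (which is the scenario he describes; the next post covers what happens when a bulb is run at the wrong voltage):

```python
# Two different bulbs, each at its own rated voltage: P = I x V.
# Both dissipate 10 W, so both cost the same to run.
bulbs = [(125.0, 0.0800), (12.0, 0.8333)]

for volts, amps in bulbs:
    watts = volts * amps
    print(f"{volts} V x {amps} A = {watts:.2f} W")
```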
psparky
#5
May22-12, 06:45 AM
Quote by John1397:
I do not know if that answers my question, but put another way if you have two bulbs:

125 volt .0800 amp = 10 watt
12 volt .8333 amp = 10 watt

both of the bulbs are rated 10 watts one has higher volts and the other higher amps, but both bulbs should give the same amount of light and if you had to buy current from the power company you would pay the same amount for both bulbs would you not?

John
You do not buy current from the power company....you buy watts! P = IV.

If I understand your question correctly........hooking a 125 volt, 10 watt light to a 12 volt source will not give you .8333 amps at 10 watts.

The power is based off the resistance of the original light.......V=IR.

Once you find the resistance of your original light, you must use V=IR again when hooking up to the 12 volt source. Do the math and you will see very low current and wattage coming out of your light when hooked up to the 12 volt source.

P=IV does not determine V=IR.

Rather, V=IR determines your P=IV.

10 watt heaters will give the same heat....but again.....V=IR must be determined first to get your watts correctly.

To make sure you've got it: if I take your 125 volt light and hook it to three different sources....say 240 V, 120 V and 12 V.
The three voltages all see the same resistance. Each one will have its own current based off of V=IR. Therefore, each one will put out different watts based off of P=IV after the current has been determined.
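The three-source example above can be sketched as follows, assuming the same 125 V / 10 W bulb with a constant resistance (in reality 240 V would quickly burn out a 125 V bulb):

```python
# Same bulb (R = V^2 / P = 1562.5 ohms) across three sources.
# V=IR fixes the current first; P=IV then gives the watts.
resistance = 125.0**2 / 10.0  # 1562.5 ohms

for volts in (240.0, 120.0, 12.0):
    current = volts / resistance  # I = V / R
    power = volts * current       # P = I x V
    print(f"{volts:5.0f} V -> {current:.4f} A, {power:.3f} W")
```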
FOIWATER
#6
May22-12, 07:17 AM
PF Gold
I agree with the last post; a simpler way to state it (not to take away from it) is this:

A 60 watt lamp is only 60 watts at its rated voltage. The voltage and resistance determine the current, not the wattage.

The voltage applied, and ultimately the current drawn, yields the wattage.

A 60 watt, 120 volt lamp is only 60 watts at 120 volts; apply less than that and you will draw less current and receive less wattage.
psparky
#7
May22-12, 07:50 AM
One more thing to John.......P does equal IV in both AC and DC....just make sure you use RMS values when doing your calculations in AC....not the peak value of your sine wave.

Simply divide the peak value of your sine wave by the square root of 2 to find the RMS value.

The calculation is more complicated for non-sine-wave inputs, but for now this should suffice for you.

Also, when talking about AC phasors (vectors).....these vectors must always be expressed in RMS......otherwise they are not correct, because a spinning vector must maintain a constant magnitude, which is the case with RMS (root mean square).
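The peak-to-RMS conversion above is one line of arithmetic; the 170 V figure below is just an illustrative example (the approximate peak of US 120 V RMS mains):

```python
import math

# RMS of a sine wave = peak / sqrt(2).
peak = 170.0
rms = peak / math.sqrt(2)  # about 120.2 V

print(f"{peak} V peak -> {rms:.1f} V RMS")
```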
NascentOxygen
#8
May22-12, 09:12 AM
HW Helper
Quote by John1397:
I have a electrical supply line putting out DC current at .6 amps by 17 volts which equals 10.2 watts here is my question if I take a 125 volt AC bulb rated 10 watts will this be enough power to burn the bulb as bright as both the output watts on DC supply and bulb are rated the same at 10 watts?
If you want your 125VAC bulb to glow exactly as brightly on DC, then you must supply it with 125VDC.

I assume we are talking about ordinary (i.e., incandescent) bulbs.
DragonPetter
#9
May22-12, 09:49 AM
If you know the AC peak-to-peak (pp) voltage and current across a resistive load R, the expression for the average power is:

[itex]P_{avg} = V_{rms} I_{rms} = \frac{V_{pp}}{2\sqrt{2}} \cdot \frac{I_{pp}}{2\sqrt{2}} = \frac{V_{pp}^{2}}{8R}[/itex]
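A quick numeric check of that peak-to-peak formula, using an assumed example of 340 V peak-to-peak across 100 ohms:

```python
import math

# Verify that Vrms * Irms equals Vpp^2 / (8R) for a resistive load.
v_pp = 340.0  # peak-to-peak volts (assumed example)
r = 100.0     # ohms (assumed example)

v_rms = v_pp / (2 * math.sqrt(2))
i_rms = (v_pp / r) / (2 * math.sqrt(2))
p_from_rms = v_rms * i_rms
p_from_formula = v_pp**2 / (8 * r)

print(f"{p_from_rms:.2f} W vs {p_from_formula:.2f} W")
```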
sophiecentaur
#10
May24-12, 03:02 PM
Sci Advisor
PF Gold
A Watt is a Watt is a Watt. To work out the mean power when the voltage is varying, you have to add up the (I times V)'s for every instant of time (i.e. you need to integrate IV over time). If your AC waveform is sinusoidal then the 'root two' factor comes into the answer, but if the AC waveform is not a sinusoid (as with some cheap inverters) there is no simple conversion factor. For a pure square wave (+/- about zero), as a matter of fact, the RMS (equivalent DC) voltage is exactly the same as the peak voltage.
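That integration can be approximated numerically to confirm both claims: averaging v^2 over one cycle gives RMS = peak/sqrt(2) for a sine wave, and RMS = peak for a +/- square wave:

```python
import math

# Discrete approximation of RMS = sqrt(mean of v^2 over one cycle).
N = 100_000  # samples per cycle
peak = 1.0

sine = [peak * math.sin(2 * math.pi * k / N) for k in range(N)]
square = [peak if k < N // 2 else -peak for k in range(N)]

rms_sine = math.sqrt(sum(v * v for v in sine) / N)
rms_square = math.sqrt(sum(v * v for v in square) / N)

print(f"sine RMS = {rms_sine:.4f}, square RMS = {rms_square:.4f}")
```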

