How to convert AC Watts to DC Watts

Discussion Overview

The discussion centers on the conversion of AC watts to DC watts, specifically addressing the implications of using a 125 volt AC bulb rated at 10 watts with a DC supply of 0.6 amps at 17 volts. Participants explore the relationship between voltage, current, and power in resistive loads, as well as the efficiency of different types of bulbs.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested

Main Points Raised

  • Some participants assert that power is equivalent in resistive loads regardless of whether it is AC or DC, but the actual brightness of bulbs may vary based on their design and operating voltage.
  • One participant calculates that a 125 volt, 10 watt bulb would consume significantly less power when connected to a 17 volt supply, suggesting it would not produce the same brightness.
  • Another participant emphasizes that a bulb rated at a specific wattage will only draw that wattage at its rated voltage, and applying a lower voltage results in lower current and power consumption.
  • There is a discussion about the importance of resistance in determining current and power when changing the voltage supplied to a bulb.
  • Some participants mention that the efficiency of bulbs can differ, meaning that two bulbs rated at the same wattage may not produce the same light output under different voltage conditions.
  • A participant notes that to achieve the same brightness from a 125 volt bulb on DC, it must be supplied with 125VDC, indicating the need for matching voltage for consistent performance.
  • There is mention of the RMS (root mean square) calculations for AC power, highlighting the complexity of conversions when dealing with non-sinusoidal waveforms.

Areas of Agreement / Disagreement

Participants express differing views on the implications of using AC versus DC for powering bulbs, with no consensus reached on the exact outcomes of using a 125 volt bulb with a lower voltage DC supply. The discussion remains unresolved regarding the efficiency and brightness of bulbs under varying voltage conditions.

Contextual Notes

Participants highlight the dependence on the resistance of the bulbs and the assumptions about their efficiency, as well as the complexities introduced by different types of AC waveforms. The calculations presented rely on specific conditions that may not apply universally.

John1397
I do not work with this type of conversion, so I am not familiar with how to understand it, so here is my problem: I have an electrical supply line putting out DC current at 0.6 amps at 17 volts, which equals 10.2 watts. Here is my question: if I take a 125 volt AC bulb rated 10 watts, will this be enough power to burn the bulb as brightly, since both the DC supply output and the bulb are rated the same at 10 watts?

John
 
John1397 said:
I do not work with this type of conversion, so I am not familiar with how to understand it, so here is my problem: I have an electrical supply line putting out DC current at 0.6 amps at 17 volts, which equals 10.2 watts. Here is my question: if I take a 125 volt AC bulb rated 10 watts, will this be enough power to burn the bulb as brightly, since both the DC supply output and the bulb are rated the same at 10 watts?

John

The key to watts is that in a resistive load the conversion is one for one. However, it depends on the current the circuit draws. Let P = power in watts, I = current in amperes, E = voltage in volts, and R = resistance in ohms. The following formulas work for both DC and AC if the load is resistive rather than inductive or capacitive: P = I^2 R, P = IE, and E = IR. From P = IE, 10 = 125I, so I = 0.08 amps, and the resistance is R = E/I = 125/0.08 = 1562.5 ohms. In a 17 volt circuit the current will be I = 17/1562.5 = 0.01088 amps, so the power consumed by the 10 watt bulb would be P = I^2 R = (0.01088)^2 × 1562.5 ≈ 0.185 watts in the 17 volt circuit. The brightness of the bulb would not be the same.
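The arithmetic above can be sketched in a few lines. This is a hedged illustration that assumes the filament resistance stays constant, which later posts note is not quite true for a real incandescent bulb:

```python
# Rated values from the post: a 125 V, 10 W bulb moved to a 17 V supply.
# Assumes constant filament resistance (a cold filament actually has lower R).
RATED_V, RATED_P = 125.0, 10.0

rated_i = RATED_P / RATED_V        # P = IE  ->  I = 0.08 A
resistance = RATED_V / rated_i     # E = IR  ->  R = 1562.5 ohms

supply_v = 17.0
i = supply_v / resistance          # 0.01088 A on the 17 V supply
p = i ** 2 * resistance            # P = I^2 R  ->  ~0.185 W
print(f"R = {resistance} ohms, I = {i:.5f} A, P = {p:.3f} W")
```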
 
Lamps are not all equally efficient, but 10 watts is the same wherever it comes from. So, the light output for a given power input will not always be the same with different lamp types.

Notice, though, that if a 125 volt lamp is run on 17 volts, the power used will be a lot less than if it was run on 125 volts. This is because the lamp will draw a much smaller current on 17 volts than it would on 125 volts.

The actual current is a little hard to predict since the lamp filament will run cooler on 17 volts and so it will have a higher resistance than it does when it runs hot with 125 volts on it.

Power is just the product of voltage and current (for resistive loads like lamps) and it doesn't matter if the voltage is AC or DC.
 
I do not know if that answers my question, but to put it another way: suppose you have two bulbs:

125 volts × 0.0800 amps = 10 watts
12 volts × 0.8333 amps = 10 watts

Both bulbs are rated 10 watts; one has higher volts and the other higher amps, but both should give the same amount of light, and if you had to buy power from the power company you would pay the same amount for both bulbs, would you not? This would work the same with two 10 watt heaters at different voltages; they would both give the same amount of heat, would they not?

John
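John's arithmetic for two separately rated bulbs, each run at its own rated voltage, does check out; a minimal sketch:

```python
# Each bulb, run at its own rated voltage, dissipates P = V * I.
bulbs = ((125.0, 0.0800), (12.0, 0.8333))  # (volts, amps) from the post above
for v, i in bulbs:
    print(f"{v:5.1f} V * {i:.4f} A = {v * i:.2f} W")
```

The responders' point, though, is that this only holds when each bulb sees its rated voltage; a 125 V bulb on a 12 V supply draws far less than 10 W.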
 
John1397 said:
I do not know if that answers my question, but to put it another way: suppose you have two bulbs:

125 volts × 0.0800 amps = 10 watts
12 volts × 0.8333 amps = 10 watts

Both bulbs are rated 10 watts; one has higher volts and the other higher amps, but both should give the same amount of light, and if you had to buy power from the power company you would pay the same amount for both bulbs, would you not?

John

You do not buy current from the power company... you buy watts (strictly, watt-hours of energy)! P = IV.

If I understand your question correctly, hooking a 125 volt, 10 watt light to a 12 volt source will not give you 0.8333 amps at 10 watts.

The power is based on the resistance of the original light: V = IR.

Once you find the resistance of your original light, you must use V = IR again when hooking it up to the 12 volt source. Do the math and you will see very low current and wattage coming out of your light when it is hooked up to the 12 volt source.

P = IV does not determine V = IR.

Rather, V = IR determines your P = IV.

10 watt heaters will give the same heat, but again, V = IR must be applied first to get your watts correctly.

To make sure you have got it: if I take your 125 volt light and hook it to three different sources, say 240 V, 120 V, and 12 V, the three voltages all see the same resistance. Each one will have its own current from V = IR. Therefore, each one will put out a different wattage from P = IV once the current has been determined.
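The three-source example can be checked numerically. This is a sketch: the 1562.5-ohm figure comes from the 125 V / 10 W rating worked out earlier in the thread, and temperature effects on the filament resistance are ignored:

```python
# The same filament resistance is seen by all three sources.
R = 1562.5  # ohms, from 125 V / 10 W (constant-resistance assumption)
for v in (240.0, 120.0, 12.0):
    i = v / R      # V = IR fixes the current first...
    p = i * v      # ...then P = IV gives the wattage
    print(f"{v:6.1f} V -> {i:.5f} A, {p:7.3f} W")
```

Only at (roughly) the rated voltage does the lamp dissipate anything close to its rated 10 watts.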
 
I agree with the last post; a simpler way to state it (not to take away from it) is this: a 60 watt lamp is only 60 watts at its rated voltage. The voltage and resistance determine the current, not the wattage.

The voltage applied, and ultimately the current drawn, yields the wattage.

A 60 watt, 120 volt lamp is only 60 watts at 120 volts; apply less than that and you will draw less current and receive less wattage.
 
One more thing to John: P does equal IV in both AC and DC; just make sure you are using RMS values when doing your calculations in AC, not the peak value of your sine wave.

Simply divide the peak value of your sine wave by the square root of 2 to find the RMS value.

The calculation is more complicated for non-sinusoidal inputs, but for now this should suffice.

Also, when talking about AC phasors (vectors), these vectors must always be expressed in RMS; otherwise they are not correct, because a rotating vector must maintain a constant magnitude, which is the case with RMS (root mean square).
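As a quick numeric illustration of the peak-to-RMS conversion, assuming a 170 V peak (roughly US mains):

```python
import math

v_peak = 170.0                 # assumed peak value of the sine wave
v_rms = v_peak / math.sqrt(2)  # peak / sqrt(2) gives the RMS value
print(f"{v_rms:.1f} V rms")    # about 120.2 V
```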
 
John1397 said:
I have an electrical supply line putting out DC current at 0.6 amps at 17 volts, which equals 10.2 watts. Here is my question: if I take a 125 volt AC bulb rated 10 watts, will this be enough power to burn the bulb as brightly, since both the DC supply output and the bulb are rated the same at 10 watts?
If you want your 125VAC bulb to glow exactly as brightly on DC, then you must supply it with 125VDC.

I assume we are talking about ordinary (i.e., incandescent) bulbs.
 
If you know the AC peak-to-peak (pp) voltage and current of a resistive load R, the expression for the equivalent DC (RMS) power is:

$$P = V_{rms} \, I_{rms} = \frac{V_{pp}}{2\sqrt{2}} \cdot \frac{I_{pp}}{2\sqrt{2}} = \frac{V_{pp}^{2}}{8R}$$
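A quick numeric check of that identity, using assumed example values (a 250 V peak-to-peak sine across the 1562.5-ohm filament figure from earlier in the thread):

```python
import math

R = 1562.5                      # ohms (assumed example load)
v_pp = 250.0                    # assumed peak-to-peak voltage
v_rms = v_pp / (2 * math.sqrt(2))
i_rms = v_rms / R
p_direct = v_rms * i_rms        # P = V_rms * I_rms
p_closed = v_pp ** 2 / (8 * R)  # the closed form above
print(p_direct, p_closed)       # both ~5.0 W
```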
 
A watt is a watt is a watt. To work out the mean power when the voltage is varying, you have to add up the (I times V)'s for every instant of time (i.e., you need to integrate IV over time). If your AC waveform is sinusoidal, then the 'root two' factor comes into the answer, but if the AC waveform is not a sinusoid (as with some cheapo inverters) there is no simple conversion factor. For a pure square wave (swinging +/- about zero), as a matter of fact, the RMS (equivalent DC) voltage is exactly the same as the peak voltage.
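That last point is easy to verify numerically: sample one full cycle of each waveform and take the root of the mean of the squares. A sketch with unit-amplitude waveforms:

```python
import math

def rms(samples):
    """Square each sample, average the squares, then take the square root."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

n = 100_000  # samples over one full cycle
sine = [math.sin(2 * math.pi * k / n) for k in range(n)]
square = [1.0 if k < n // 2 else -1.0 for k in range(n)]

print(rms(sine))    # ~0.7071, i.e. peak / sqrt(2)
print(rms(square))  # 1.0 -- RMS equals peak for a +/-1 square wave
```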
 
