Change in watts and power dissipated

AI Thread Summary
The discussion revolves around calculating the power dissipated by a 100 W incandescent light bulb rated for US line voltage (120 V rms) when it is plugged in in the UK, where line voltage is 230 V. Participants note that the problem assumes the bulb's resistance is constant, so the correct approach is first to find that resistance from the US ratings (I = P/V, then R = V/I) and then apply P = V²/R at the higher voltage. The thread converges on roughly 370 W dissipated in the UK, illustrating how strongly power dissipation depends on the applied voltage.
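
A minimal Python sketch of the method the summary describes, assuming (as the problem states) that the bulb behaves as a fixed resistor; a real filament's resistance actually rises with temperature:

Code:
# Rated US values and UK line voltage (all rms)
V_US = 120.0     # V
P_RATED = 100.0  # W at 120 V
V_UK = 230.0     # V

R = V_US**2 / P_RATED   # bulb resistance: 144 ohms (same as R = V / I with I = P / V)
P_UK = V_UK**2 / R      # power at 230 V: about 367 W, i.e. roughly 370 W
print(f"R = {R:.0f} ohm, P_UK = {P_UK:.0f} W")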
aChordate
One of my homework questions is giving me trouble:

"In, the US, the rms voltage from power outlets (known as line voltage) is 120 V. In the United Kingdom, line voltage is 230 V. If you take a lamp with a standard 100 W incandescent light bulb from the US, how much power will it dissipate if used in the UK? Assume that the electrical resistance of the bulb is constant."


So, I know that V=IR and P=VI. V1=120 V and V2=230 V, and the P of the light bulb = 100 W. Should I plug the power of the light bulb into the formula P=IV to find the current, and then use that to find the excess power? For example,

P/V=I

(100W)/(120V)=I

I=0.8A

P=(0.8A)(230 V)= 200 W - 100 W = 100 W dissipated

I have no idea if I am on the right track or not, please help.

Thanks in advance.
 
aChordate said:
One of my homework questions is giving me trouble:

"In, the US, the rms voltage from power outlets (known as line voltage) is 120 V. In the United Kingdom, line voltage is 230 V. If you take a lamp with a standard 100 W incandescent light bulb from the US, how much power will it dissipate if used in the UK? Assume that the electrical resistance of the bulb is constant."


So, I know that V=IR and P=VI. V1=120 V and V2=230 V, and the P of the light bulb = 100 W. Should I plug the power of the light bulb into the formula P=IV to find the current, and then use that to find the excess power?
bolding mine

No. Find the resistance of the light bulb. As stated: "Assume that the electrical resistance of the bulb is constant."
For example,

P/V=I

(100W)/(120V)=I

I=0.8A

P=(0.8A)(230 V)= 200 W - 100 W = 100 W dissipated

I have no idea if I am on the right track or not, please help.

Thanks in advance.

You are welcome.
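
A quick numeric sketch of that hint, assuming the bulb's rated US values fix its resistance:

Code:
V_US, P_RATED = 120.0, 100.0  # rated voltage (V) and power (W)
I = P_RATED / V_US            # about 0.833 A drawn at 120 V
R = V_US / I                  # 144 ohms (equivalently R = V**2 / P)
print(f"I = {I:.3f} A, R = {R:.0f} ohm")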
 
So, should I calculate it like this:

P/V=I

(100W)/(120V)=I

I=0.83A

R=V/I = 120V/0.83A = 145 Ω

R=100 Ω

and then would I use the formula P = I²R?
 
There's another power relation that might prove to be helpful: P = V²/R.
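
A short sketch of that relation applied here, using the roughly 144-ohm resistance implied by the US ratings (the 145 Ω above comes from rounding the current to 0.83 A):

Code:
R = 120.0**2 / 100.0     # ~144 ohm from the US rating
P_UK = 230.0**2 / R      # P = V^2 / R at the UK line voltage
print(f"{P_UK:.0f} W")   # about 367 W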
 
P/V=I
(100W)/(120V)=I
I=0.83A

R=V/I = 120V/0.83A = 145 Ω
R=100 Ω

Then,
P = V²/R
P = (230 V)²/(100 Ω) = 529 W

So,
529W-100W = 429W or 400W dissipated?
 
aChordate said:
P/V=I
(100W)/(120V)=I
I=0.83A

R=V/I = 120V/0.83A = 145 Ω
R=100 Ω
Why do you write R = 100 Ω after calculating it to be 145 Ω ?
Then,
P = V²/R
P = (230 V)²/(100 Ω) = 529 W

So,
529W-100W = 429W or 400W dissipated?

Does the question ask you to find the actual power dissipated or the difference in power dissipated? When I read the quoted question, it looks like they just want the power dissipated by that bulb when it's energized at 230V.
 
Oh, oops, I was going by sig. figs. I recalculated and got 370 W dissipated.

I thought "dissipated" meant the extra power beyond what the 100 W light bulb normally uses, which is why I found the difference. Hmm, I guess not.
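
For reference, a one-line check of that 370 W figure, again assuming the 144-ohm resistance from the rated values:

Code:
print(230.0**2 / (120.0**2 / 100.0))  # 367.36... W, which rounds to roughly 370 W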
 
Also, thank you so much for your input!
 
aChordate said:
Also, thank you so much for your input!


Your 370W value looks quite reasonable. You're welcome :smile:
 