Change in watts and power dissipated

In summary, the conversation works through how to calculate the power dissipated by a 100 W US light bulb when it is used in the United Kingdom, where the line voltage is 230 V rather than 120 V. The formula P = V²/R is suggested, the resistance of the bulb works out to about 144 Ω (145 Ω if the current is rounded first), and the final conclusion is that the bulb would dissipate approximately 370 W under these conditions.
  • #1
aChordate
One of my homework questions is giving me trouble:

"In, the US, the rms voltage from power outlets (known as line voltage) is 120 V. In the United Kingdom, line voltage is 230 V. If you take a lamp with a standard 100 W incandescent light bulb from the US, how much power will it dissipate if used in the UK? Assume that the electrical resistance of the bulb is constant."


So, I know that V = IR and P = VI. V1 = 120 V, V2 = 230 V, and the power of the light bulb is 100 W. Should I plug the power of the light bulb into the formula P = IV to find the current, and then use that to find the excess power? For example,

P/V=I

(100W)/(120V)=I

I=0.8A

P=(0.8A)(230 V)= 200 W - 100 W = 100 W dissipated

I have no idea if I am on the right track or not, please help.

Thanks in advance.
 
  • #2
aChordate said:
One of my homework questions is giving me trouble:

"In, the US, the rms voltage from power outlets (known as line voltage) is 120 V. In the United Kingdom, line voltage is 230 V. If you take a lamp with a standard 100 W incandescent light bulb from the US, how much power will it dissipate if used in the UK? Assume that the electrical resistance of the bulb is constant."


So, I know that V = IR and P = VI. V1 = 120 V, V2 = 230 V, and the power of the light bulb is 100 W. Should I plug the power of the light bulb into the formula P = IV to find the current, and then use that to find the excess power?
bolding mine

No. Find the resistance of the light bulb. As stated: "Assume that the electrical resistance of the bulb is constant."
For example,

P/V=I

(100W)/(120V)=I

I=0.8A

P=(0.8A)(230 V)= 200 W - 100 W = 100 W dissipated

I have no idea if I am on the right track or not, please help.

Thanks in advance.

You are welcome.
 
  • #3
So, should I calculate it like this:

P/V=I

(100W)/(120V)=I

I=0.83A

R=V/I = 120V/0.83A = 145 Ω

R=100 Ω

and then would I use the formula P = I²R?
 
  • #4
There's another power relation that might prove to be helpful: P = V²/R.
 
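A quick derivation of that relation from the formulas already used in this thread (P = VI and V = IR): substituting I = V/R into P = VI gives

    P = V·I = V·(V/R) = V²/R

so for a fixed resistance the dissipated power scales with the square of the applied voltage.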
  • #5
P/V=I
(100W)/(120V)=I
I=0.83A

R=V/I = 120V/0.83A = 145 Ω
R=100 Ω

Then,
P = V²/R
P = (230 V)²/100 Ω = 529 W

So,
529 W − 100 W = 429 W, or about 400 W, dissipated?
 
  • #6
aChordate said:
P/V=I
(100W)/(120V)=I
I=0.83A

R=V/I = 120V/0.83A = 145 Ω
R=100 Ω
Why do you write R = 100 Ω after calculating it to be 145 Ω?
Then,
P = V²/R
P = (230 V)²/100 Ω = 529 W

So,
529 W − 100 W = 429 W, or about 400 W, dissipated?

Does the question ask you to find the actual power dissipated or the difference in power dissipated? When I read the quoted question, it looks like they just want the power dissipated by that bulb when it's energized at 230V.
 
  • #7
Oh, oops, I was going by sig. figs. I recalculated and got 370 W dissipated.

I thought "dissipated" meant the extra power beyond what the 100 W bulb normally uses, which is why I found the difference. Hmm, I guess not.
 
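As a quick numeric check of this result, here is a minimal Python sketch (the two line voltages and the 100 W rating are taken from the thread; the variable names are mine):

    # Power drawn by a constant-resistance bulb rated 100 W at 120 V when run on 230 V.
    rated_power = 100.0   # W, US rating
    v_us = 120.0          # V, US line voltage
    v_uk = 230.0          # V, UK line voltage

    resistance = v_us ** 2 / rated_power   # R = V^2 / P = 144 ohms
    power_uk = v_uk ** 2 / resistance      # P = V^2 / R, about 367 W

    print(f"R = {resistance:.0f} ohm, P_UK = {power_uk:.0f} W")

Keeping full precision gives R = 144 Ω and about 367 W; rounding the current to 0.83 A first is what produces the 145 Ω seen earlier, and either way the result rounds to the 370 W quoted here.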
  • #8
Also, thank you so much for your input!
 
  • #9
aChordate said:
Also, thank you so much for your input!


Your 370 W value looks quite reasonable. You're welcome :smile:
 

1. What is the difference between watts and power dissipated?

Watts and power dissipated are related but not identical concepts. The watt is the unit of power, that is, the rate at which energy is transferred or converted. "Power dissipated" refers to the power that is actually converted into heat (or other forms) in a component or system, and it is expressed in watts. In other words, watts are the unit in which power dissipated is measured.

2. How is change in watts calculated?

The change in watts can be calculated by finding the difference between the final and initial power values, using the formula ΔP = P_final − P_initial.
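For example, using the numbers from the thread above (my own illustration, not part of the original answer): the bulb dissipates about 100 W on the 120 V supply and about 367 W on the 230 V supply, so ΔP = 367 W − 100 W ≈ 267 W.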

3. What factors can cause a change in watts?

Several factors can cause a change in watts, including changes in the voltage or current of a circuit, changes in the resistance of a circuit, and changes in the efficiency of a system. Additionally, external factors such as temperature and environmental conditions can also affect the rate of energy transfer and thus the change in watts.

4. How does a change in watts affect the performance of a system?

A change in watts can directly affect the performance of a system, as it represents the rate of energy transfer. If there is a significant increase or decrease in watts, it can impact the efficiency and functioning of the system. For example, a higher wattage may lead to overheating and potential damage, while a lower wattage may result in decreased performance.

5. Can a change in watts be controlled or regulated?

Yes, a change in watts can be controlled or regulated by adjusting the factors that affect power dissipation. For example, in a circuit, the voltage and current can be controlled to regulate the power dissipated. In a larger system, efficiency can be improved through design or maintenance to minimize the change in watts.
