How Does Increasing Voltage Reduce Power Loss in Electrical Transmission?

AI Thread Summary
Increasing the transmission voltage reduces power loss because the same power can be delivered at a lower current, which minimizes resistive (I²R) losses in the wires. The power station delivers 750 kW at 12,000 V, but the original calculation mistakenly suggests it delivers 48,000 kW, leading to confusion. The correct approach is to calculate the loss from the voltage drop across the transmission line, not to assume all of the power is dissipated there. Using P = (ΔV)² / R, where ΔV is the drop along the line, one can determine the current and the resulting loss. Ultimately, higher-voltage transmission is more efficient, significantly reducing wasted power.
skepticwulf

Homework Statement


A power station delivers 750 kW of power at 12,000 V to a factory through wires with a total resistance of 3 Ohm. How much less power is wasted if the electricity is delivered at 50,000 V rather than 12,000 V?

Homework Equations


P = IV, P = V²/R, P = I²R

The Attempt at a Solution


With the given V1 and R values, the power comes out as:
P1 = (12,000)² / 3 = 48,000 kW, but what is this 750 kW then?
Does this station deliver power at 48,000 kW or at 750 kW?

Again, for the V2 and R values:
P2 = (50,000)² / 3 ≈ 833,333 kW
I would take the difference of P2 and P1, but obviously that's not right according to the solution manual. Why? I'm totally lost.
 
Can you draw a diagram of the circuit? Most of the power is delivered to the factory; only a small fraction is lost in the transmission wires.
 
I'm afraid there was no picture associated with the problem given. Just the text above.
 
You still need a picture. What you calculated corresponds to the wires from the power plant being simply short-circuited at the factory, so that the 12,000 V sees only the 3 Ohm of the wires ...
The fact that the exercise comes without a picture doesn't have to stop you from drawing one yourself. You need to know what you are doing, and a picture helps!
 
skepticwulf said:

P1 = (12,000)² / 3 = 48,000 kW, but what is this 750 kW then? Does this station deliver power at 48,000 kW or at 750 kW? ... I would take the difference of P2 and P1, but obviously that's not right according to the solution manual. Why? I'm totally lost.
Here, the difficulty arises with your choice of formula to calculate the power loss in the transmission wires.

By using P = V² / R to calculate the power loss, you are, in effect, assuming that there is zero potential remaining at the far end of the transmission line, which would mean that all 750 kW from the plant was lost in transmission. In reality, this formula should be written P = (ΔV)² / R, where ΔV is the voltage drop between the power station and the factory; unfortunately, you do not know ΔV directly.

Since the same power is being transmitted, but at two different voltages, the current in each scenario will be different. More importantly, the current into the line from the power station is the same current coming out of the line at the factory. From the power delivered and the line voltage, you can calculate that current, and the resistance of the transmission line, which is the same in each scenario, then gives the loss.
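The steps above can be sketched numerically (a minimal check, assuming the 750 kW in the problem is the power delivered and the 3 Ohm is the total resistance of the wires):

```python
# Line loss at two transmission voltages for the same delivered power.
P = 750e3   # delivered power, W
R = 3.0     # total resistance of the transmission wires, ohms

def line_loss(V):
    I = P / V            # current in the line: I = P / V
    return I**2 * R      # power dissipated in the wires: P_loss = I^2 * R

loss_12k = line_loss(12_000)   # I = 62.5 A  -> 11,718.75 W lost
loss_50k = line_loss(50_000)   # I = 15 A    ->    675.00 W lost
saving = loss_12k - loss_50k   # about 11 kW less wasted at 50,000 V

print(f"Loss at 12 kV: {loss_12k:.2f} W")
print(f"Loss at 50 kV: {loss_50k:.2f} W")
print(f"Power saved:   {saving:.2f} W")
```

As a consistency check, the voltage drop along the line at 12 kV is ΔV = IR = 62.5 × 3 = 187.5 V, and (187.5)² / 3 = 11,718.75 W reproduces the same loss as I²R, in line with the corrected formula P = (ΔV)² / R.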
 
Thank you
 
My previous post was a little hazy. What I meant to say was: knowing the voltage on the transmission line (together with the power being delivered), you can calculate the current coming out of the power plant in each scenario.
 