Electric power distribution from power plant to homes

  • Thread starter: EN1986
I am trying to understand why transferring electric power from the power plant to my house is more efficient at high voltage.

The suggested explanation, that the current equals the supplied power divided by the voltage, so a higher voltage means a lower current and therefore a lower power loss in the conductors, is very confusing to me.

I know that the current is determined by the voltage and the resistance, not by a power rating, which only sets a limit on the allowable consumption. By that reasoning, a higher voltage (assuming a constant resistance made up of the line resistance and the load resistance) leads to a higher current, and therefore a higher power loss.

What am I missing?
 
Power lines are resistors. The more current that flows through them, the greater the voltage drop from source to user, and thus the more power is dissipated.
The total power delivered is volts times amps. So, given a certain amount of required power, a higher voltage means less current and thus lower transmission losses.
 
##W = I \cdot V##
If you double the line voltage, then for the same power, the load current will be halved.
Then consider the power lost in the line resistance ##R##.
Ohm's law: ##V = I \cdot R##, so
##W = I^2 \cdot R##
By doubling the voltage, the power lost in the line is reduced to one quarter.
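A quick numeric illustration of this (a minimal sketch; the 10 Ω line and 100 kW load are hypothetical values and the model is purely resistive):

```python
# Illustrative sketch: deliver a fixed power over a line of fixed
# resistance, and compare the line loss at V and at 2V (assumed values).
R_line = 10.0       # ohms, assumed line resistance
P_load = 100e3      # watts required by the load

for V in (10e3, 20e3):          # 10 kV vs 20 kV
    I = P_load / V              # current needed for the same power
    P_loss = I**2 * R_line      # power dissipated in the line
    print(f"V = {V/1e3:.0f} kV: I = {I:.1f} A, line loss = {P_loss/1e3:.2f} kW")

# V = 10 kV: I = 10.0 A, line loss = 1.00 kW
# V = 20 kV: I =  5.0 A, line loss = 0.25 kW  (half the current, a quarter the loss)
```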
 
EN1986 said:
I know that the current is determined by the voltage and the resistance, not by a power rating, which only sets a limit on the allowable consumption. By that reasoning, a higher voltage (assuming a constant resistance made up of the line resistance and the load resistance) leads to a higher current, and therefore a higher power loss.

What am I missing?
What you are saying is true for a simple circuit with one source and one load/resistor, but the power line isn't the only thing in the circuit. The loads the power line feeds need a specific current, voltage, and therefore power.
 
What you are missing is that, since the wire resistance and the load resistance (let's keep it all resistive for simplicity) are in series, they carry the same current, no matter what power the load requires.

The voltage is more or less the same at the transmitting station and at the receiving station, the difference being the (hopefully small) voltage drop in the wires. So the loss in the wires is ##R_{wire} \cdot I_{load}^2##, and for the same power at the load, this loss is lower the smaller ##I_{load}## is.
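A small sketch of that series picture (all numbers hypothetical): the load's effective resistance is set by its rated power at the delivery voltage, roughly ##R_{load} \approx V^2/P##, and the one shared series current then fixes the wire loss.

```python
# Sketch: wire and load in series share one current (assumed values).
# The load is designed to draw its rated power at the delivery voltage,
# so its effective resistance is roughly V^2 / P_rated.
R_wire = 2.0        # ohms, assumed
P_rated = 50e3      # watts the load needs

for V in (5e3, 50e3):
    R_load = V**2 / P_rated        # load resistance scales with V^2
    I = V / (R_wire + R_load)      # the single series current
    loss = I**2 * R_wire           # power lost in the wires
    print(f"V = {V/1e3:.0f} kV: I = {I:.2f} A, "
          f"wire loss = {loss:.1f} W ({100*loss/(V*I):.3f}% of input)")
```

Note that the load resistance is not constant: it grows as ##V^2##, which is exactly what breaks the 'constant total resistance' assumption in the question.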
 
There's a distinction between voltage and voltage drop. Voltage is determined by generation and by the operating requirements of the system.

You seem to be confusing the two (system voltage and voltage drop): ##V_{drop} = I \cdot R##.

Voltage drop is calculated irrespective of the nominal voltage of the system. It makes no difference whether it's a 12 kV, 25 kV, 69 kV, 230 kV, or 500 kV system; the voltage loss is the same calculation.

Power losses are similar: ##P_{loss} = I^2 \cdot R##.
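To make that concrete, here's a sketch (with made-up current and resistance values): the drop calculation ignores the nominal voltage, but the relative drop shrinks as the system voltage rises.

```python
# Sketch: same current and line resistance at every system voltage
# (hypothetical values), so the absolute drop V_drop = I*R is identical,
# but the drop as a fraction of nominal voltage shrinks.
I, R = 200.0, 1.5                      # amps, ohms (assumed)
V_drop = I * R                         # 300 V drop in every case

for V_nom in (12e3, 25e3, 69e3, 230e3, 500e3):
    print(f"{V_nom/1e3:>5.0f} kV system: drop = {V_drop:.0f} V "
          f"= {100*V_drop/V_nom:.2f}% of nominal")
```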
 
EN1986 said:
What am I missing?
For one: the transformers are not just adjusting the voltage; they are also modifying the apparent resistance (well, more like impedance) accordingly. And, of course, the transformer at the other end of the line, which sets the lower voltage, will also 'reset' the resistance.

Thus: the whole 'transmission through elevated voltages' business is kind of 'transparent'.
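A minimal sketch of that 'transparent' idea, assuming an ideal (lossless) transformer with turns ratio ##n##: the load impedance seen from the primary side is ##n^2 Z_{load}##.

```python
# Sketch, ideal transformer: voltage scales by n, current by 1/n,
# so the primary side sees the load impedance multiplied by n^2.
def reflected_impedance(Z_load, n):
    """Impedance seen on the primary side of an ideal n:1 transformer."""
    return n**2 * Z_load

Z_load = 5.0    # ohms at the consumer (assumed)
n = 100         # hypothetical step-down ratio, e.g. 23 kV -> 230 V
print(reflected_impedance(Z_load, n))   # 50000.0 ohms seen by the line

# The line sees a high impedance, so it carries a small current (small
# I^2*R loss), while the consumer still gets low voltage and high current.
```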
 
EN1986 said:
a higher voltage […] while assuming a constant resistance […] leads to a higher current that results in a higher power loss.
I think your problem is the assumption of constant resistance. Although true for a fixed circuit, it doesn't apply to the power grid.

What you need to make things happen is a certain amount of power, so, for a given (constant) amount of power:
$$P=VI$$
You can either have high ##V## and low ##I##, or vice versa.

Because of ##P={I^2}R## you’re better off with low ##I##.
 
Guineafowl said:
You can either have high ##V## and low ##I##, or vice versa.

Because of ##P={I^2}R## you’re better off with low ##I##.
I think a more cogent argument is the following.
Define ##V_0## to be the voltage at the generator, ##R_W## the resistance of the delivery wiring and ##I## the current flowing in that wiring. Then the power that can be delivered to a load is:$$P_L=V_L\,I=\left(V_0-IR_W\right)I=V_0\,I-I^2 R_W\tag{1}$$This power vanishes at the two limiting currents ##0## and ##V_0/R_W## and so it must be maximized for some ##I## between those values. Thus:$$0=\frac{dP_{L}}{dI}=V_{0}-2IR_{W}\:\Rightarrow\:I_{max\,pow}=\frac{V_{0}}{2R_{W}}\tag{2}$$Using this in eq.(1) gives then:$$P_{L\:max}=\frac{V_{0}^{2}}{4R_{W}}\tag{3}$$Thus, maximizing the generator voltage ##V_0## always maximizes the delivered power ##P_L##.
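A quick numeric check of eqs. (1)-(3) above (a sketch; the values of ##V_0## and ##R_W## are assumed):

```python
# Numeric check of eqs. (1)-(3): sweep the current, find the maximum
# delivered power, and compare with the closed-form results.
import numpy as np

V0, R_W = 1000.0, 4.0                     # volts, ohms (hypothetical)
I = np.linspace(0.0, V0 / R_W, 100001)    # sweep I from 0 to V0/R_W
P_L = V0 * I - I**2 * R_W                 # delivered power, eq. (1)

print(I[np.argmax(P_L)], V0 / (2 * R_W))  # 125.0   125.0   -> eq. (2)
print(P_L.max(), V0**2 / (4 * R_W))       # 62500.0 62500.0 -> eq. (3)
```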
 
  • #10
renormalize said:
I think a more cogent argument is the following.
Define ##V_0## to be the voltage at the generator, ##R_W## the resistance of the delivery wiring and ##I## the current flowing in that wiring. Then the power that can be delivered to a load is:$$P_L=V_L\,I=\left(V_0-IR_W\right)I=V_0\,I-I^2 R_W\tag{1}$$This power vanishes at the two limiting currents ##0## and ##V_0/R_W## and so it must be maximized for some ##I## between those values. Thus:$$0=\frac{dP_{L}}{dI}=V_{0}-2IR_{W}\:\Rightarrow\:I_{max\,pow}=\frac{V_{0}}{2R_{W}}\tag{2}$$Using this in eq.(1) gives then:$$P_{L\:max}=\frac{V_{0}^{2}}{4R_{W}}\tag{3}$$Thus, maximizing the generator voltage ##V_0## always maximizes the delivered power ##P_L##.
But as others have said, ##R_W## isn't necessarily a constant.

I don't think maximum power delivery is the only metric to optimize; it leads to very inefficient solutions. Other things a utility may value:
1) Infrastructure costs.
2) Operating costs.
3) Power quality.
4) Reliability.

Several of these strongly favor a low loss system, if they can afford to build it.
 
  • #11
DaveE said:
Several of these strongly favor a low loss system, if they can afford to build it.
Sure, but the OP specifically asked:
EN1986 said:
I am trying to understand why transferring electric power from the power plant to my house is more efficient at high voltage.
That's the basic question I aimed to answer in my post #9.
 
  • #12
renormalize said:
Sure, but the OP specifically asked:
EN1986 said:
I am trying to understand why transferring electric power from the power plant to my house is more efficient at high voltage.
He asked about "more efficient", not just "more". Your example is good, but maximum power for a given line isn't really the point: utilities simply don't want to operate in that very lossy region with its poor load regulation.

I interpret this more as minimum losses (cost, really) for a given user load. As @Rive said, it's more about transformers (and equivalents) in distribution. You could also note that a higher voltage requires less copper for the same efficiency, as sketched below. In this overly simplistic model there is no optimum voltage: a higher voltage always results in lower resistive losses, without limit.
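Here's that copper argument as a sketch (all values assumed; purely resistive, single conductor, return path ignored): for fixed delivered power ##P##, line length ##L##, and allowed loss fraction ##f##, the required conductor cross-section scales as ##1/V^2##.

```python
# Sketch of the 'less copper' argument (assumed values, idealized):
# fix the delivered power, the line length, and the allowed loss
# fraction, then solve for the copper cross-section needed at each V.
rho = 1.7e-8      # ohm*m, resistivity of copper
L = 50e3          # m, line length (hypothetical; return path ignored)
P = 10e6          # W delivered
f = 0.02          # allow 2% of P to be lost in the line

for V in (25e3, 230e3):
    I = P / V                    # current at this voltage
    R_allowed = f * P / I**2     # largest R with I^2*R = f*P
    A = rho * L / R_allowed      # cross-section giving that R
    print(f"V = {V/1e3:.0f} kV: A = {A*1e6:.1f} mm^2")

# V = 25 kV:  A = 680.0 mm^2
# V = 230 kV: A = 8.0 mm^2   -> required area drops as 1/V^2
```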
 
  • #13
The poor OP is likely thoroughly confused by now. The end goal is always to have a steady, unchanging voltage where the end user utilizes it. Whatever the user connects to this voltage source will draw current as needed while the supply voltage stays steady. So with a steady voltage, the current and the watts delivered are proportional. The idea is to have as little loss as possible in the imperfect wires throughout the distribution system. Using transformers as previously described accomplishes this. I'd say this is simply a case of the OP misapplying Ohm's law.
 
  • #14
Averagesupernova said:
The poor OP is likely thoroughly confused by now.
The OP has not been back since posting the question.
 
  • #15
Baluncore said:
The OP has not been back since posting the question.
No surprise there...
 
  • #16
EN1986 said:
I am trying to understand why transferring electric power from the power plant to my house is more efficient at high voltage.

The suggested explanation, that the current equals the supplied power divided by the voltage, so a higher voltage means a lower current and therefore a lower power loss in the conductors, is very confusing to me.

I know that the current is determined by the voltage and the resistance, not by a power rating, which only sets a limit on the allowable consumption. By that reasoning, a higher voltage (assuming a constant resistance made up of the line resistance and the load resistance) leads to a higher current, and therefore a higher power loss.

What am I missing?
I believe you're missing the step-down stage that converts transmission voltage to local substation voltage, and then the consumer load on the substation.
 
  • #17
Averagesupernova said:
The poor OP is likely thoroughly confused by now.
Likely right. So, there are multiple facets of this 'higher voltage' thing.

One is the transmission itself. Somewhere around the generators there will be a transformer which elevates the voltage to transmission level, so power can be fed into the grid at a lower current (and thus with lower losses). Somewhere around the consumers there will be another transformer which brings it back to consumer-level voltage and can supply high currents. The voltage, current, and apparent impedance are all different in different sections.

Another is the consumption side. At a higher voltage, less current (and thus less copper) is needed for a required power. This part is about elevating the 'nominal' voltage at the outlet, so consumer equipment needs to be ready for the change. In Europe, this was the 220 V to 230 V change, with equipment already required to tolerate up to 240 V. The higher voltage is more efficient, but the change takes time and old equipment has to be phased out. Equipment is designed to draw what it needs at the nominal voltage, so at a higher nominal voltage the same 1 kW motor draws less current. An old 220 V design and a new 230 V design are not the same, but new equipment will indeed draw less, despite Ohm's law, because it was designed to.

Third is the short-term stability of the grid. The actual voltage is not exactly 230 V but fluctuates around it. At 235 V, resistive loads take more power than nominal, while at 225 V they take less. So if there is momentary overgeneration on the grid, the voltage rises and the resulting higher consumption soaks it up; if there is undergeneration, consumption graciously falls without any intervention.
However, some equipment (anything with an electronic power supply) consumes constant power, and thus draws less current at a higher voltage and more at a lower voltage, the opposite of a regular resistive load. Such loads work against this self-regulation of the grid, so too many of them becomes a problem.
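A sketch of that last point (nominal values assumed): a resistive load sized for 1 kW at 230 V self-stabilizes, while a constant-power load draws more current as the voltage sags.

```python
# Sketch (assumed nominal values): resistive vs constant-power load
# response as the grid voltage swings around its 230 V nominal.
V_nom, P_nom = 230.0, 1000.0       # a nominal 1 kW load
R = V_nom**2 / P_nom               # resistive load sized for 1 kW at 230 V

for V in (225.0, 230.0, 235.0):
    P_resistive = V**2 / R         # power tracks voltage (stabilizing)
    I_const = P_nom / V            # current *rises* as the voltage sags
    print(f"V = {V:.0f} V: resistive load {P_resistive:.0f} W, "
          f"constant-power load current {I_const:.3f} A")
```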
 
