Energy Loss vs Energy Delivered and Voltage Drop - Confused

In summary, a 1 MW load is supplied by a 10kV feeder through a cable with 1 Ω of resistance. The current drawn is I = P/V = 100 A, the line loss is I²R = 10 kW, and the voltage drop along the cable is I·R = 100 V. The power delivered to the load is therefore less than the power supplied, because of the losses in the transmission system, and the efficiency works out to roughly 99% (or about 98% of the nominal power if the load is treated as a fixed resistance, as discussed below).
  • #1
mathological
Hi,

OK, so every time I think I have understood the concepts of energy transmission, losses and voltage drop, I get even more confused.

I have searched several threads on this forum and the physics forum but failed to find anything that directly answers my query. So I kindly request your help on the fundamental concept.

I will try to give an example and I would like you guys to correct me where I am wrong -- Thanks in advance!
-------------------------------------------------------------------------
If a 1 MW load is to be supplied by a 10kV feeder, and the cable resistance is 1 ohm, then the current drawn by the load (or the current traveling on the cable) is:

I = P/V = 1 MW / 10kV = 100 A

The line losses are I^2*R = 10 kW and the Voltage Drop = I*R = 100 V

Is it right to say that:

a) Voltage at supply point is 10kV
b) Voltage at consumer end is 9.9kV (10kV - 100V)
c) The current through the cable is 100A
d) The power required is 1 MW but the actual power being delivered is 0.99 MW [ (P - losses) OR (V*I = 9.9kV * 100A) ]
e) hence the efficiency of this system is 0.99MW/1MW * 100 = 99% efficient?
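
Here's a quick Python sketch of how I arrived at these numbers (assuming an ideal 10kV source and the constant 100 A from above) - please tell me if the reasoning itself is off:

P_load = 1e6        # load power, W
V_supply = 10e3     # supply voltage, V
R_cable = 1.0       # cable resistance, ohm

I = P_load / V_supply           # 100 A
P_loss = I**2 * R_cable         # 10 kW line loss
V_drop = I * R_cable            # 100 V drop along the cable
V_load = V_supply - V_drop      # 9.9 kV at the consumer end
P_delivered = V_load * I        # 0.99 MW reaching the load
efficiency = P_delivered / (P_delivered + P_loss)   # ~0.99

print(I, P_loss, V_drop, V_load, P_delivered, efficiency)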
------------------------------------------------------------------------------
Please correct me where I am wrong.

Thank you.
 
  • #2
All looks right to me.

Fish
 
  • #3
nevermind... looks good
 
  • #4
@mathological
It's near enough OK but, to be accurate, you need to know the actual resistance of the load (assume it's V²/nominal power, where V is the nominal operating voltage) - that would imply a 100 Ω load resistance.
You have assumed a cable resistance of 1 Ω, which would be in series with the 100 Ω load.
The voltage drop across the feeder will be 10kV × 1/101.
This gives a voltage across the load of 10kV/1.01, which gives about 98% of the power (proportional to the square of the volts). So it's less efficient than you would initially have thought. There's a double whammy in there! Weird, huh?
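
For illustration, a rough Python sketch of this series-circuit view (assuming the load really is a fixed 100 Ω resistor rather than a constant-power device):

V_supply = 10e3                      # nominal supply voltage, V
R_load = V_supply**2 / 1e6           # V^2 / nominal power = 100 ohm
R_cable = 1.0                        # ohm, in series with the load

# Potential divider: the load only sees its share of the 10kV
V_load = V_supply * R_load / (R_load + R_cable)   # ~9.901 kV
P_load = V_load**2 / R_load                       # ~0.980 MW, about 98% of nominal
print(V_load, P_load)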

Edit - Tidied up the bit about the volts on the load - no change in info- just removed a duplication
 
  • #5
What if the load was hooked up in parallel?
 
  • #6
In parallel with what??
You wouldn't connect the cables straight across the mains!
 
  • #7
true
 
  • #8
sophiecentaur said:
You have assumed a cable resistance of 1 Ω, which would be in series with the 100 Ω load.
The voltage drop across the feeder will be 10kV × 1/101.

Are you using voltage division here?

sophiecentaur said:
This gives a voltage across the load of 10kV/1.01, which gives about 98% of the power (proportional to the square of the volts).

Sorry, I don't get how you calculated this... can you please elaborate? Thanks! :biggrin:
 
  • #9
Look, mathological, you have stipulated that your load dissipates 1 MW. Now, you have to be clear whether this 10kV pair of wires you call a feeder has 10kV at the source or 10kV at the load. The voltage at the source is not the same as the voltage at the load.

Pick one or the other and the answer is determinate.
 
  • #10
@mathological
Yes, I am using voltage division (it's the same old potential divider circuit that we come across everywhere).

I was assuming that the original "1MW load" was one which is designed to dissipate 1 MW when presented with 10kV. Few loads adjust themselves to take their specified power; they are mostly 'dumb' resistors, or equivalent.
The power dissipated in a resistor is V²/R, so the power you get at the end of the cable will be (1/1.01)² (≈ 0.98) times as much as without the cable.
Actually, to be fair, the supply (generator) would be delivering less current (also 1/1.01 as much) so, defining efficiency as
power out / power supplied,
the actual efficiency will not be as low as 98%, because less actual power will be put into the system by the generator. But you still get only 98% of the power you wanted.
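
To spell the distinction out, a quick sketch under the same fixed-load assumption:

V_supply = 10e3
R_load = 100.0          # ohm, the 'dumb' 1 MW load
R_cable = 1.0           # ohm

I = V_supply / (R_load + R_cable)   # ~99.0 A, slightly less than the nominal 100 A
P_supplied = V_supply * I           # ~0.990 MW put in by the generator
P_load = I**2 * R_load              # ~0.980 MW reaching the load
P_cable = I**2 * R_cable            # ~9.8 kW lost in the cable

efficiency = P_load / P_supplied    # ~0.99, power out / power supplied
shortfall = P_load / 1e6            # ~0.98 of the 1 MW you actually wanted
print(efficiency, shortfall)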
 
  • #11
@sophiecentaur

Thanks for the clarification! :)
 

1. What is the difference between energy loss and energy delivered?

Energy loss refers to the amount of energy that is dissipated or wasted during the transmission or distribution of electricity. This can be due to factors such as resistance in the wires or equipment, and it results in a decrease in the amount of energy that is actually delivered to the intended destination. Energy delivered, on the other hand, is the amount of energy that actually reaches the end user and can be used for its intended purpose.

2. How does voltage drop affect energy loss and energy delivered?

Voltage drop is a phenomenon that occurs when electricity travels through a wire or conductor, resulting in a decrease in voltage at the end destination. This decrease in voltage can lead to a decrease in energy delivered, as the lower voltage means less energy is available for use. The same conductor resistance that causes the voltage drop also dissipates energy as heat (the I²R losses) in the wires or equipment, so voltage drop and energy loss go hand in hand.

3. What causes energy loss and voltage drop?

Energy loss and voltage drop can be caused by a variety of factors, including the length and thickness of the wires, the quality of the materials used, and the load or demand on the system. Other factors such as temperature, humidity, and magnetic fields can also contribute to energy loss and voltage drop.

4. How can we minimize energy loss and voltage drop?

There are several ways to minimize energy loss and voltage drop, including using higher quality materials with lower resistance, reducing the length of the wires, and properly maintaining and upgrading equipment. Additionally, implementing energy-efficient practices and technologies can also help to reduce energy loss and voltage drop.
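
As a rough, back-of-the-envelope illustration of the first two points (using the standard R = ρL/A relation with made-up cable dimensions), the sketch below shows how the loss falls with a thicker or shorter conductor and, via the lower current, with a higher transmission voltage:

# Hypothetical feeder: resistance from resistivity, length and cross-section
rho_cu = 1.68e-8       # resistivity of copper, ohm*m
length = 5_000         # one-way cable length, m (assumed figure)
area = 100e-6          # conductor cross-section, m^2 (100 mm^2, assumed)

R_cable = rho_cu * length / area      # ~0.84 ohm
I = 1e6 / 10e3                        # 100 A to deliver 1 MW at 10 kV
P_loss = I**2 * R_cable               # ~8.4 kW of I^2*R loss

# Doubling the cross-section halves R_cable and the loss;
# doubling the transmission voltage halves I and cuts the loss to a quarter.
print(R_cable, P_loss)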

5. What are the implications of energy loss and voltage drop for energy efficiency?

Energy loss and voltage drop can have a significant impact on energy efficiency, as they can result in a decrease in the amount of energy that is actually delivered and used for its intended purpose. This not only leads to wasted energy, but it can also result in higher energy bills and a strain on the overall electricity grid. Therefore, it is important to address and minimize these issues in order to improve energy efficiency.
