Low load factor and Voltage drops

In summary, this thread discusses the relationship between load factor and system losses in a distribution transformer. A low load factor leads to higher system losses and voltage drops: partly through the effect of reactive current on the mostly inductive series impedance of the power system, and partly because a peakier load profile delivers the same energy at higher currents, producing disproportionately higher I²R losses. The definition of load factor varies among sources, with some defining it as the average load over the peak load and others as the current load over the rated load.
  • #1
rozan977
My lecturer, in the notes he provided to us, mentioned that a low load factor of a distribution transformer results in higher system losses and voltage drops. I thought that by higher system losses he meant the iron losses in the transformer, which are comparatively higher at no-load and light-load conditions. But I just couldn't figure out how a low load factor causes voltage drops in the system.
 
  • #2
He's probably referring to the effect of reactive current causing comparatively much more voltage drop in upstream reactive impedance.

Because the series impedance of the power system is primarily inductive, the voltage drops due to reactive current subtract directly from the load voltage, whereas those from the real current component add vectorially at right angles.

For example, say you had a load taking 1.0 pu reactive current and the series impedance was j0.1 pu. The required generator voltage is 1.1 pu (the voltage drop is in phase so adds algebraically). Now consider the same case but with 1.0 pu real current being supplied. The required generator voltage is now only 1.005 pu (volt drop adds at 90 degrees).
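The two cases above can be checked with a few lines of complex arithmetic, taking the load voltage as the 1.0∠0° reference phasor (the impedance and current values are the ones from the example):

```python
# Required sending-end voltage for 1.0 pu load current through a purely
# inductive series impedance of j0.1 pu, load voltage as reference.

Z = 0.1j            # series impedance, pu (purely inductive)
V_load = 1.0 + 0j   # receiving-end voltage, pu (reference phasor)

# Case 1: purely reactive (lagging) load current, 1.0 pu
I_reactive = -1.0j                       # lags V_load by 90 degrees
V_gen_reactive = V_load + I_reactive * Z
print(abs(V_gen_reactive))               # 1.1 -- drop adds algebraically

# Case 2: purely real (in-phase) load current, 1.0 pu
I_real = 1.0 + 0j
V_gen_real = V_load + I_real * Z
print(abs(V_gen_real))                   # ~1.005 -- drop adds at right angles
```

The reactive ampere needs roughly twenty times more generator voltage headroom than the real ampere here, which is the point of the example.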
 
  • #3
uart, I think you are confusing 'load factor' with 'power factor'.
@rozan977, I think the lecturer is trying to explain what would happen in an existing distribution system if the load factor goes low. In that case, assuming the same average monthly consumption, a low load factor means high currents for a shorter period of time (as compared to the current being spread out in time, as with a higher load factor).
Higher currents obviously result in a higher voltage drop (during the high-current period). If we do the maths, we can also see that when currents are high for a short period and low for a longer period (instead of evenly spread out), the power loss in the line is higher.
For example, for one building the current is 10 A for the whole 24 hours (load factor of 1).
Then its daily energy loss = 10^2 * R * 24.
In another building, the current is 20 A for 12 hours and 0 A for 12 hours (same average daily energy). This gives a load factor of 0.5.
Then its daily energy loss = 20^2 * R * 12, which is double the previous case.
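The two-building comparison above can be verified numerically (R is left symbolic in the post; here it is set to 1 Ω, which cancels in the ratio anyway):

```python
# Same energy delivered, different load factors: I^2*R losses compared.
R = 1.0  # line resistance in ohms (arbitrary; cancels in the ratio)

# Building 1: constant 10 A for 24 h (load factor = 1.0)
loss_flat = 10**2 * R * 24    # 2400 * R

# Building 2: 20 A for 12 h, 0 A for 12 h (load factor = 0.5)
loss_peaky = 20**2 * R * 12   # 4800 * R

print(loss_peaky / loss_flat) # 2.0 -- losses double at half the load factor
```

Because losses scale with the square of current, concentrating the same energy into half the time doubles the total I²R loss.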
 
  • #4
I_am_learning said:
uart, I think you are confusing 'Load factor' with 'power factor'.

I think the lecturer is trying to explain what would happen in a distribution system already in existence if the Load factor goes low. In such case, assuming same average monthly consumption, a low load factor means, high currents for shorter period of time, (as compared to the currents being distributed in time, in case of higher load factor).

No, actually I think you're confusing load factor with something like duty factor or duty cycle (the ratio of average to peak).

The load factor of a transformer (at any given point in time) is simply the operating power (at that time) as a ratio or percentage of its rated power. Since the magnetizing reactance draws approximately constant reactive current (independent of load), a low load factor implies a worse power factor. So no, I'm not confusing them; they are in fact related.
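The link uart describes can be sketched with a few illustrative numbers (the 0.03 pu magnetizing current is an assumption for the sketch, not a value from the thread): as the real load current falls, the fixed reactive component becomes a larger share of the total current, so the power factor worsens.

```python
import math

def power_factor(i_load, i_mag=0.03):
    """Power factor seen by the supply, given a real load current and a
    fixed magnetizing (reactive) current, both in pu."""
    return i_load / math.hypot(i_load, i_mag)

for i in (1.0, 0.5, 0.1):
    print(f"load {i:.1f} pu -> power factor {power_factor(i):.4f}")
```

At full load the magnetizing current is negligible, but at 0.1 pu load the power factor has visibly degraded, which is the sense in which a low load factor and a poor power factor go together.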
 
  • #5
In the book "Electric Power Distribution Equipment and Systems", author T.A. Short says:
T.A. Short said:
• Load factor — The ratio of the average load over the peak load. Peak
load is normally the maximum demand but may be the instantaneous
peak. The load factor is between zero and one. A load factor
close to 1.0 indicates that the load runs almost constantly. A low load
factor indicates a more widely varying load. From the utility point
of view, it is better to have high load-factor loads. Load factor is
normally found from the total energy used (kilowatt-hours) as:
LF = kWh/(dkW*h)
where
LF = load factor
kWh = energy use in kilowatt-hours
dkW = peak demand in kilowatts
h = number of hours during the time period
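The quoted average-over-peak formula is easy to evaluate directly. The numbers below are made up for illustration (a 30-day month with a 20 kW peak):

```python
# LF = kWh / (dkW * h), the average-over-peak definition quoted above.

kwh = 7200.0   # energy used over the period, kWh (assumed value)
dkw = 20.0     # peak demand, kW (assumed value)
h = 24 * 30    # hours in the period (one 30-day month)

lf = kwh / (dkw * h)
print(lf)      # 0.5 -- the load averages half its peak
```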
 
  • #6
The definition I've got here is:

Transformer Load Factor : Current load / rated load.

Edit: Just googled a few papers on the subject, and some authors use the definition "current power / rated power" while others use "average / peak". So yeah, take your pick. :cry: I guess the OP will have to let us know which particular definition of "load factor" he's using.
 

Related to Low load factor and Voltage drops

1. What is low load factor?

Load factor is the ratio of the average power used by a system to the peak power drawn by the system. It is usually expressed as a percentage and is an indicator of how efficiently a system is being utilized. A low load factor means that the system's demand is concentrated into short peaks rather than spread evenly over time.

2. What causes low load factor?

Low load factor can be caused by a variety of factors, including inefficient equipment or processes, overcapacity of a system, or irregular usage patterns. It can also be influenced by external factors such as weather or seasonal changes.

3. How does low load factor affect voltage drops?

A low load factor means the same energy is delivered in shorter, higher-current bursts. Since the voltage drop along a line is proportional to the current flowing through it, these high-current periods produce larger voltage drops at the load. Because line losses scale with the square of current, the total I²R loss over the period is also higher than it would be if the same energy were drawn evenly.

4. What are the consequences of voltage drops?

Voltage drops can have serious consequences, including damage to equipment, reduced efficiency, and even power outages. When voltage drops occur, it can cause equipment to overheat, leading to malfunctions and potentially costly repairs. In addition, voltage drops can result in power fluctuations, which can disrupt operations and cause downtime.

5. How can low load factor and voltage drops be prevented?

To prevent low load factor and voltage drops, it is important to regularly monitor and maintain equipment to ensure it is operating efficiently. Additionally, implementing energy-efficient practices and avoiding overcapacity can also help prevent these issues. It is also recommended to conduct regular load testing to identify and address any potential issues before they become major problems.
