
Low load factor and Voltage drops

  1. Aug 23, 2012 #1
    My lecturer mentioned in the notes he provided that a low load factor on a distribution transformer results in higher system losses and voltage drops. I thought that by higher system losses he meant the iron losses of the transformer, which are comparatively higher at no-load and light-load conditions. But I just couldn't figure out how a low load factor causes voltage drops in the system?
  3. Aug 24, 2012 #2



    He's probably referring to the effect of reactive current causing comparatively much more voltage drop across the upstream reactive impedance.

    Because the series impedance of the power system is primarily inductive, the voltage drop due to reactive (lagging) current adds algebraically, in phase with the bus voltage, whereas the drop due to the real current component adds vectorially at right angles.

    For example, say you had a load taking 1.0 pu reactive current and the series impedance was j0.1 pu. The required generator voltage is 1.1 pu (the voltage drop is in phase, so it adds algebraically). Now consider the same case but with 1.0 pu real current being supplied. The required generator voltage is now only about 1.005 pu (the voltage drop adds at 90 degrees).
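    The two cases above can be checked directly with complex arithmetic; this is just a sketch of the numbers in the example, taking the load-bus voltage as the 1.0 pu reference phasor.

    ```python
    # Voltage-drop comparison for a purely inductive series impedance.
    # Values are the ones used in the example: 1.0 pu current, Z = j0.1 pu.

    Z = 0.1j           # series impedance, pu (purely inductive)
    V_load = 1.0 + 0j  # load-bus voltage, pu (reference phasor)

    # Case 1: 1.0 pu purely reactive (lagging) current
    I_reactive = -1.0j
    V_gen_reactive = abs(V_load + I_reactive * Z)  # drop is in phase -> 1.1 pu

    # Case 2: 1.0 pu purely real current
    I_real = 1.0 + 0j
    V_gen_real = abs(V_load + I_real * Z)          # drop at 90 deg -> ~1.005 pu

    print(round(V_gen_reactive, 3), round(V_gen_real, 3))  # 1.1 1.005
    ```

    The same 0.1 pu of impedance drop raises the required source voltage by 10% in the reactive case but only about 0.5% in the real-current case.
    
    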
  4. Aug 25, 2012 #3
    uart, I think you are confusing 'load factor' with 'power factor'.
    @rozan977, I think the lecturer is trying to explain what would happen in an existing distribution system if the load factor goes low. In that case, assuming the same average monthly consumption, a low load factor means high currents for shorter periods of time (as compared with the current being spread out evenly in time, as in the higher-load-factor case).
    Higher currents obviously result in a higher voltage drop (during the high-current period). If we do the maths, we can also see that when the current is high for a short period and low for a long period (instead of being evenly spread out), the power loss in the line is higher.
    For example, in one building the current is 10 A for the whole 24 hours (load factor of 1).
    Its daily energy loss = 10² × R × 24 = 2400R.
    In another building the current is 20 A for 12 h and 0 A for 12 h (same average daily energy), giving a load factor of 0.5.
    Its daily energy loss = 20² × R × 12 = 4800R, which is double the previous case.
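    The two-building comparison works out like this (the line resistance R is arbitrary here, since it cancels in the ratio):

    ```python
    # Same average energy delivered, different load factors,
    # resistive line loss = I^2 * R * hours.

    R = 1.0  # line resistance, ohms (arbitrary; cancels in the ratio)

    # Building 1: 10 A for all 24 h (load factor = 1)
    loss_even = 10**2 * R * 24   # = 2400 * R

    # Building 2: 20 A for 12 h, 0 A for 12 h (load factor = 0.5)
    loss_peaky = 20**2 * R * 12  # = 4800 * R

    print(loss_peaky / loss_even)  # -> 2.0
    ```

    The concentrated load loses twice the energy in the line, because loss goes as the square of the current while the delivered energy is only linear in it.
    
    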
  5. Aug 25, 2012 #4



    No, actually I think you're confusing load factor with something like duty factor or duty cycle (the ratio of average to peak).

    The load factor of a transformer (at any given point in time) is simply its operating power at that time as a ratio or percentage of its rated power. Since the magnetizing reactance draws an approximately constant reactive current (independent of load), a low load factor implies a worse power factor. So no, I'm not confusing them; they are in fact related.
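    The load-factor/power-factor link above can be illustrated numerically. The 0.03 pu magnetizing current below is an assumed, purely illustrative figure (not from the thread), and the load itself is assumed to draw real current only.

    ```python
    import math

    # If the magnetizing branch draws a roughly constant reactive current,
    # the overall power factor seen by the supply worsens as load factor drops.

    I_mag = 0.03  # assumed constant magnetizing (reactive) current, pu

    for load_factor in (1.0, 0.5, 0.1):
        I_real = load_factor  # real current tracks the load (unity-PF load assumed)
        pf = I_real / math.hypot(I_real, I_mag)
        print(f"load factor {load_factor:.1f} -> power factor {pf:.4f}")
    ```

    The fixed reactive component is negligible at full load but dominates the power-factor angle as the load shrinks.
    
    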
    Last edited: Aug 25, 2012
  6. Aug 25, 2012 #5
    "ELECTRIC POWER DISTRIBUTION EQUIPMENT AND SYSTEMS", author T.A. Short, says
  7. Aug 25, 2012 #6



    The definition I've got here is:

    Transformer Load Factor : Current load / rated load.

    Edit: I just googled a few papers on the subject, and some authors use the definition "current power / rated power" while others use "average / peak". So yeah, take your pick. :cry:

    I guess the OP will have to let us know which particular definition of "load factor" he's using.
    Last edited: Aug 25, 2012