
Increased current at a home dropping a neighbor's voltage?

  1. Aug 31, 2014 #1
    At this old thread:

    the reply:
    Now, whether this extra current matters depends on why the power supply voltage is dropping. If it is due to excessive current, then yes, increased current in one house could cause a slight drop in voltage for the other houses.

    Has me curious as to whether this can be a real scenario.
    Is it possible that, if one neighbor uses a great deal more power than a typical neighbor, your voltage could drop?
    How would that affect your service?
    Let's say your whole-home voltage level drops to 112 volts. Do all the currents rise then?
    And could that cause breakers to trip that wouldn't have tripped at the correct/higher voltage?

    Just curious what the concerns would be if a neighbor decided to open a shop at his home and began to use more power.
  3. Aug 31, 2014 #2


    Gold Member

    Let me give you an example. Let's say you have a substation at some point. Some kilometers away from the substation, you have a distribution transformer OK?

    Let's say this one distribution transformer is feeding a few homes in parallel.

    I think you would agree that all the power drawn by the houses, plus the transformer losses, is drawn through the primary of the distribution transformer.

    There is a line resistance/reactance from the substation to the distribution transformer that causes a voltage drop proportional to the current drawn by the residences.

    If one residence increases its load substantially, there is a higher voltage drop along the line. That reduces the distribution transformer's secondary voltage, so the other houses see a lower incoming voltage because of the residence that increased its load.
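    The line-drop effect described above can be sketched numerically. This is a minimal model with assumed illustrative numbers (the line resistance, source voltage, and house currents are not from the thread, and reactance is ignored):

```python
LINE_RESISTANCE = 0.5    # ohms, assumed resistance of the line from the substation
SOURCE_VOLTAGE = 2400.0  # volts, assumed voltage at the substation bus

def voltage_at_transformer(total_current):
    """Voltage left at the distribution transformer's primary after the
    resistive line drop (reactance ignored for simplicity)."""
    return SOURCE_VOLTAGE - total_current * LINE_RESISTANCE

normal = voltage_at_transformer(3 * 10.0)            # three houses at 10 A each
heavy = voltage_at_transformer(10.0 + 10.0 + 30.0)   # one house triples its load

print(normal, heavy)  # the heavier load lowers the voltage for everyone
```

    Since all the houses hang off the same transformer, the drop caused by the one heavy load shows up in every house's supply.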

    Make any sense?
  4. Aug 31, 2014 #3


    Gold Member

    No the current will not rise.

    That is a misconception about P = EI. The power is not a fixed quantity in that equation. For a simple resistive load, the only fixed quantity among P, V, I, and R is R (which is not really fixed either, since it is temperature dependent, but assume it is).

    The voltage applied to a fixed resistance determines the current drawn; the current times the voltage determines the power drawn. If you decrease the voltage across a fixed resistance, you reduce the current and therefore the power draw.

    Imagine applying 0.000000000005 mV to a 60 W light bulb. Do you think the bulb will still try to draw its rated current from the source? No, it will draw hardly any current at all.
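    The fixed-resistance case above can be put into numbers. A sketch with an assumed resistance (roughly a 60 W bulb's hot resistance at 120 V):

```python
R = 240.0  # ohms, assumed fixed resistance

def current_and_power(volts):
    """For a fixed resistance, I = V / R and P = V * I."""
    current = volts / R
    return current, volts * current

i_full, p_full = current_and_power(120.0)  # 0.5 A, 60.0 W
i_sag, p_sag = current_and_power(112.0)    # both current AND power go down

print(i_sag, p_sag)
```

    For a purely resistive load, lowering the voltage lowers the current and the power together; nothing rises.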

    This is an important concept, so give it an honest read! :)
  5. Aug 31, 2014 #4
    I love this comment you made:

    If you decrease the voltage to a fixed resistance, you reduce the current and therefore the power draw.

    I have always looked at it as how something draws power. That's where I think I assumed current would rise when voltage dropped. My work with transformers had gotten me thinking that way.

    Your comment made me realize that the way I should look at it is this:
    it's not about how a load draws power. When the circuit to a load is completed, the path is open for power to flow, but the voltage still has to push the current through, so there is less current as the voltage drops.

    Thank you for that reminder on how things actually work. I had gotten into the habit of thinking about power draw.

  6. Aug 31, 2014 #5
    If the distribution network of your electric utility is somewhat sensible (most are, but it depends on where you live, of course), then your neighbor's power consumption shouldn't affect your supply voltage in any significant way.

    The part of the grid that might feed you and your neighbors in parallel is itself fed by a network at much, much higher voltages than what is available to you. The increase in current in this part of the network due to any shift in power consumption at your end is usually completely negligible.
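    The point about the upstream network barely noticing can be illustrated with assumed numbers: for a given power, current scales inversely with voltage (I = P / V), so a household-sized load is tiny on a medium-voltage feeder. The 13.8 kV feeder voltage below is an assumption for illustration, not a figure from the thread:

```python
P = 5000.0              # watts, one house's extra load (assumed)

i_service = P / 120.0     # amps drawn on the 120 V service side
i_feeder = P / 13800.0    # amps the same load adds on an assumed 13.8 kV feeder

print(i_service, i_feeder)  # ~41.7 A at the house, well under 1 A upstream
```

    A fraction of an amp of extra feeder current produces a negligible extra drop across the feeder's impedance, which is why a well-dimensioned network isolates you from your neighbor's habits.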
  7. Aug 31, 2014 #6


    Science Advisor
    Gold Member
    2017 Award

    It would seem that you have had a light-bulb moment. You've got cause and effect the right way round now. :smile:
    All is clear.

    I dislike the expression "current draw"; it is a piece of shorthand that assumes a particular supply voltage, so it is often used by 'single voltage' workers. The only time it properly applies is when you actually have a constant-current source, which produces whatever voltage is needed across the load (whatever you hang on it) to force that value of "current draw". Somehow, "power consumption" doesn't sound so bad, because it is more obviously to do with domestic electrical appliances.
  8. Sep 1, 2014 #7
    Houses also have motors, and low voltage can cause increased current in motor applications. If a motor sits at stall or starting current for too long, it can trip circuit breakers. Also, if the motor runs below its rated speed (lower voltage increases the slip in an induction motor), the induced back-EMF is reduced, causing an increase in current.
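    The motor case is the opposite of the fixed-resistance case. A sketch under an assumed constant-mechanical-load model (a loaded motor draws roughly constant power, so current rises as voltage sags; the 1500 W figure is an assumption):

```python
P_MOTOR = 1500.0  # watts, assumed mechanical load on the motor

def motor_current(volts):
    """Constant-power approximation for a loaded motor: I = P / V."""
    return P_MOTOR / volts

i_nominal = motor_current(120.0)  # 12.5 A at nominal voltage
i_sag = motor_current(112.0)      # higher current at the lower voltage

print(i_nominal, i_sag)
```

    This is why a brown-out can trip breakers on motor circuits even though purely resistive loads on the same panel are drawing less than usual.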
  9. Sep 1, 2014 #8



    Staff: Mentor

    Yes, it is possible that a substantial drop in line voltage could cause a substantial rise in current, with consequent breaker tripping. Motors don't like brown-outs!