Increased current at a home dropping neighbors' voltage?

  1. At this old thread:
    https://www.physicsforums.com/showthread.php?t=362716

    the reply:
    Now, whether this extra current matters depends on why the power supply voltage is dropping. If it is due to excessive current, then yes, increased current in one house could cause a slight drop in voltage for the other houses.


    Has me curious as to whether this can be a real scenario.
    Is it possible that if a neighbor uses a good deal more power than a typical neighbor, your voltage could drop?
    How would that affect your service?
    Let's say your whole-house voltage drops to 112 volts. Now all currents rise, correct?
    And could that cause breakers to trip that wouldn't have tripped at the correct, higher voltage?

    Just curious as to what the concerns would be if a neighbor decided to open a shop at his home and began to use more power.
     
  2. FOIWATER

    Gold Member

    Let me give you an example. Let's say you have a substation at some point, and some kilometers away from the substation you have a distribution transformer, OK?

    Let's say this one distribution transformer is feeding a few homes in parallel.

    I think you would agree that all the power drawn by the houses, plus the transformer losses, is drawn through the primary of the distribution transformer.

    There is a line resistance/reactance from the substation to the distribution transformer that causes a voltage drop proportional to the current drawn by the residences.

    If one residence increases its load substantially, there is a larger voltage drop on the line. That reduces the distribution transformer's secondary voltage, and therefore the other houses see a lower incoming voltage because of the residence that increased its load.
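
    To put rough numbers on it, here is a minimal sketch (all values invented for illustration: a 240 V source and a 0.05 Ω line resistance, lumped and referred to the secondary, with reactance ignored):

    ```python
    # Voltage at the transformer secondary, shared by all the houses.
    # Invented numbers: 240 V source, 0.05 ohm lumped line resistance
    # referred to the secondary; reactance ignored for simplicity.

    V_SOURCE = 240.0   # substation voltage referred to the secondary (V)
    R_LINE = 0.05      # lumped line resistance referred to the secondary (ohms)

    def secondary_voltage(total_amps):
        """Secondary voltage after the drop across the shared line."""
        return V_SOURCE - total_amps * R_LINE

    print(secondary_voltage(3 * 20))        # three houses at 20 A each -> 237.0 V
    print(secondary_voltage(20 + 20 + 60))  # one house jumps to 60 A  -> 235.0 V
    ```

    Everyone on that transformer sees the same 2 V sag, even though only one house changed its load.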

    Make any sense?
     
  3. FOIWATER

    Gold Member

    No, the current will not rise.

    That is a misconception about P = EI. The power is not fixed in that equation. Of P, V, I, and R, the only fixed quantity is R (which is not really fixed either, since it is temperature dependent, but assume it is).

    The voltage applied to a fixed resistance determines the current draw, and the current draw times the voltage determines the power draw. If you decrease the voltage across a fixed resistance, you reduce the current and therefore the power draw.

    Imagine applying 0.000000000005 mV to a 60 W light bulb. Do you think the bulb will try to draw enough current to still dissipate 60 W? No, it will hardly draw any current at all.
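
    A quick numeric check of this, treating the bulb as the fixed resistance described above (R = V²/P = 240 Ω at its 120 V rating; real filaments have a much lower cold resistance, so this is just the idealization in the post):

    ```python
    # A 60 W, 120 V bulb idealized as a fixed resistance.
    R = 120.0 ** 2 / 60.0  # R = V^2 / P = 240 ohms

    for volts in (120.0, 112.0, 5e-15):  # rated, sagged, and the tiny example above
        amps = volts / R                 # Ohm's law: I = V / R
        watts = volts * amps             # P = V * I
        print(f"{volts:g} V -> {amps:g} A, {watts:g} W")
    ```

    The current falls right along with the voltage; nothing tries to hold the power at 60 W.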

    This is an important concept, so give it an honest read! :)
     
  4. I love this comment you made:

    If you decrease the voltage to a fixed resistance, you reduce the current and therefore the power draw.

    I have always looked at it in terms of how something draws power. That's why I assumed the current would rise when the voltage dropped. My work with transformers had gotten me thinking that way.

    Your comment made me realize that the way I should look at it is this:
    it's not about how a load draws power. When the circuit to a load closes, the path is open for power to flow, but the voltage still has to push the current through, so less current flows as the voltage drops.

    Thank you for that reminder of how things actually work. I had gotten into the habit of thinking in terms of power draw.

    Kudos!
     
  5. If the distribution network of your electric utility is somewhat sensible (most are, but it depends on where you live, of course), then your neighbor's power consumption shouldn't affect your supply voltage in any significant way.

    The part of the grid that might feed you and your neighbors in parallel is itself fed by a network at much, much higher voltages than what is available to you. The increase in current in this part of the network due to any shift in power consumption at your end is usually completely negligible.
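
    A back-of-the-envelope illustration (assumed numbers: a neighbor adds a 5 kW load; the service is 240 V and the upstream feeder runs at 11 kV):

    ```python
    # The same added power is far less current at feeder voltage,
    # since I = P / V. All numbers are assumed for illustration.
    extra_watts = 5_000.0

    print(extra_watts / 240.0)     # ~20.8 A extra at the 240 V service
    print(extra_watts / 11_000.0)  # ~0.45 A extra on the 11 kV feeder
    ```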
     
  6. sophiecentaur

    Science Advisor
    Gold Member

    It would seem that you have had a light-bulb moment. You've got cause and effect the right way round now. :smile:
    All is clear.

    I dislike the expression "Current Draw"; it is a piece of shorthand that assumes a particular supply voltage, so it is often used by 'single voltage' workers. The only time it can properly apply is when you actually have a constant-current source, which produces whatever voltage is needed across the load (whatever you hang on it) to force that value of "Current Draw". Somehow, "Power Consumption" doesn't sound so bad, because it is more obviously to do with domestic electrical appliances.
     
  7. Houses also have motors. A low voltage can cause increased current flow in motor applications. If the motor is held at stall or starting current for too long, it can trip circuit breakers. Also, if the motor is running below rated speed (lower voltage causes increased slip in induction motors), the induced back-EMF will be reduced, causing an increase in current flow.
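
    To contrast with the fixed-resistance case earlier in the thread, here is a crude constant-mechanical-power approximation (ratings invented: roughly a 1 hp load at 0.8 power factor and 85% efficiency; real motor current also depends on slip and winding reactance):

    ```python
    # Unlike a resistor, a loaded motor keeps delivering roughly the
    # mechanical power its load demands, so lower voltage means MORE current.
    P_MECH = 746.0       # mechanical load in watts (about 1 hp), assumed
    PF, EFF = 0.8, 0.85  # power factor and efficiency, assumed

    def motor_amps(volts):
        """Approximate line current for a constant mechanical load."""
        return P_MECH / (volts * PF * EFF)

    for v in (120.0, 112.0, 100.0):
        print(f"{v:.0f} V -> {motor_amps(v):.1f} A")
    ```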
     
  8. NascentOxygen

    Staff: Mentor

    Yes, it is possible that a substantial drop in line voltage could see a substantial rise in current, with consequent breaker tripping. Motors don't like brown-outs!
     