Home AC supply - why live, neutral and ground needed?

  1. I have checked HowStuffWorks, About.com, Ask.com, Google, and even asked an electrical engineer, and found no satisfying answer to the following questions:
    For household AC, why do we need a 4-wire connection: two live (hot) wires, a neutral, and a ground? We get 3 wires from the street transformer, where the secondary coming toward the home has a center tap. The voltage from either end of the secondary to the center tap is 120 volts, and end-to-end it is 240 volts. We use one leg for the light bulbs and toasters etc. (120 V), and we use both legs for the central air conditioner (240 V).
    If you hook up a light bulb with only the live wire and use the ground instead of the neutral wire, it still lights up. I know power companies use the planet's ground as a return path for current.
    The center tap of the secondary coming to the home can be grounded, can it not?
    Since the neutral wire and the ground wire are both return paths for current, why don't we just use the live wire and the ground wire, with no neutral wire, for single-phase appliances?
    Why do we need a separate neutral wire for a single-phase 120 volt light bulb?
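    The split-phase arrangement described above can be sketched with phasors. This is an illustrative aside (not part of the original thread), assuming an ideal center-tapped secondary at the US nominal 120/240 V; the two hot legs are 180 degrees apart, so leg-to-neutral gives 120 V while leg-to-leg gives 240 V:

```python
import cmath

# Assumed ideal value: 120 V RMS from the center tap to either end.
V_leg = 120.0

# Take the center tap (the neutral) as the 0 V reference.
V_a = cmath.rect(V_leg, 0.0)       # leg A: 120 V at 0 degrees
V_b = cmath.rect(V_leg, cmath.pi)  # leg B: 120 V at 180 degrees

print(abs(V_a))        # leg to neutral: 120.0 V
print(abs(V_a - V_b))  # leg to leg: 240.0 V (to within float rounding)
```

    This is why one service can feed both 120 V lamps (one leg to neutral) and 240 V appliances (leg to leg) without a second transformer.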
    Last edited: Oct 27, 2013
  3. Roughly speaking, the hot and neutral are the main electrical connections, and the ground is there for safety. What you don't actually need electrically is the ground.

    When current flows from hot to ground, it has to find its way back to neutral. If that path conducts poorly, and you happen to be barefoot and providing a better conduction path, the current will flow through you.

    Google "need for neutral and ground" and read what comes up.
  4. dlgoff

    dlgoff 3,158
    Science Advisor
    Gold Member
    2014 Award

  5. AlephZero

    AlephZero 7,248
    Science Advisor
    Homework Helper

    There is no logical reason why you "need" 4 wires, and many other countries have different wiring systems with only two wires (for appliances constructed so they are guaranteed to be double-insulated) or three (two plus a safety ground).

    The basic answer is "because that's the way the USA decided to wire electricity to houses". The other posts explain what the system is, but in the final analysis "there is no reason why".
    Thanks for the explanation. There must be a good reason why the US decided to distribute AC that way. Does anyone know why it is done this way?
    Last edited: Oct 28, 2013
  7. sophiecentaur

    sophiecentaur 14,715
    Science Advisor
    Gold Member

    It was all down to history and some fairly arbitrary choices. My theory is that the US chose a basic supply voltage that was lower and significantly safer (i.e. half) than the European choice. This was fine when the loads were lighting, a fridge and a radio set. Once people started to need more powerful appliances, the current demand was too high for a supply of around 100 V, so the option of twice that was provided. Rather than change all the light bulbs etc., homes had to be given the choice of around 100 V and around 200 V for new equipment. Many (most?) people get their own supply transformer.
    The Europeans chose around 200V to start with and that is enough for a house full of electrical appliances. Any really heavy loads get a three phase supply. Very few users get their own transformer.
  8. Borek

    Staff: Mentor

    Yep - I have a 24 kW water heater that uses three phases. And 24 kW does qualify as a heavy load.
  9. sophiecentaur

    And they say size doesn't matter!
  10. The reason we have what we have in the USA goes back to Edison and DC. 100 volts was chosen, and the system was set up like our split-phase AC is. The reason was that the currents cancel in what we refer to as the neutral, reducing voltage drop along the line. The voltage was later increased to 110 volts to compensate for dim lights at the end of the line; it was not enough of an overvoltage to cause trouble at the start of the line. So we have always had the low (100) and high (200) voltages. It was not because we started to require more power. The arrangement was adopted for AC as well.
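    To make the point about cancellation concrete, here is a small illustrative calculation (an editor's sketch with assumed numbers, not historical data): for the same total lighting load, splitting it across the two halves of a 3-wire system halves the current in the line conductors, so resistive line loss (and hence voltage drop) falls by a factor of four compared with a 2-wire 100 V feed.

```python
# Assumed illustrative values, not historical data.
R_line = 0.5     # ohms per line conductor
P_load = 2000.0  # total lighting load in watts

# 2-wire, 100 V: the full current flows out one wire and back the other.
I_2w = P_load / 100.0            # 20 A
loss_2w = 2 * I_2w**2 * R_line   # both conductors carry 20 A

# 3-wire, balanced: load split across the two 100 V halves (200 V end to end).
# The neutral carries nothing, so only the two outer wires carry current.
I_3w = P_load / 200.0            # 10 A
loss_3w = 2 * I_3w**2 * R_line

print(loss_2w, loss_3w)  # 400.0 W vs 100.0 W: a factor of 4
```

    Halving the current while doubling the voltage is exactly the copper-saving trick the Edison 3-wire system exploited.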
  11. sophiecentaur

    But what could have been the reason for wanting a dual standard of supply voltage? Engineers have always gone for single standards wherever possible, in most fields. What is different about electrical supply? There are two standards of supply voltage in automotive engineering for very good reasons - 24 V batteries are more expensive for a given energy storage capacity, but 12 V starter motors for large engines require too much current for the cables if 12 V is used. A very good reason, I should say, but there are huge knock-ons for the owner of a 'large' vehicle who wants to replace some generic part and finds it costs an arm and a leg compared with the 12 V equivalent. This EE forum is full of comments about it.

    Have you any historical evidence that the original supplies were dual voltage? What use would the extra 220 V supply have been to the average house with a power demand of a couple of kW? A direct question has been asked about this, so we should come up with an answer if we can. When electricity was in its early days, I could imagine terrific reaction against totally re-equipping with new light bulbs and house wiring, so a bolt-on system would have been attractive. Think of the expense of changing all that in order to install a high-powered electric heater (plus the need for three wires to the house, rather than two). People would just stick with gas or (very cheap) oil. How many years have US supplies been dual voltage?

    That US supply system gives serious problems for 'the student' who tries to think of it as a 'two-phase' system and wants to relate it to the three-phase system used for serious electrical supplies.
  12. I guess I need to be clearer. The dual-voltage system was used to help avoid voltage drop. Two batteries of 100 volts each were wired in series. The lights were run on 100 volts. The neutral wire (the one connected to the node where the batteries are joined) didn't carry any current when the currents from each battery were balanced. This obviously helped reduce voltage drop compared to running the same number of lights from a single 100 volt source. Why 200 volt bulbs were not used could be down to reliability as well as safety. Face it, most people don't get shocked by 220 volts in their homes in the U.S., nor would anyone likely have been shocked by more than 100 volts in the Edison system. It wasn't more than a few years ago that I read that places on the east coast of the U.S. were still served by DC. I'm sure it has all been switched over by now.
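    The neutral behaviour described above can be sketched in a few lines. This is an illustrative aside with assumed currents (not thread data): in a 3-wire system the neutral carries only the difference between the two sides, so a balanced load leaves it carrying nothing at all.

```python
def neutral_current(i_side_a: float, i_side_b: float) -> float:
    """Neutral current in a 3-wire (Edison-style) system: the imbalance
    between the two sides (A-side current minus B-side current)."""
    return i_side_a - i_side_b

# Balanced lighting: equal current on each 100 V half, neutral carries nothing.
print(neutral_current(15.0, 15.0))  # 0.0 A

# Unbalanced: the neutral carries only the 5 A difference,
# not the full 20 A drawn by the heavier side.
print(neutral_current(20.0, 15.0))  # 5.0 A
```

    This is why the neutral conductor could be sized smaller than the outers in such systems: it never sees more than the worst-case imbalance.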
  13. sophiecentaur

    Oh boy. Was that really a system? It sounds crazy. How did they select which lights to connect to which side? It would be a nightmare system to manage. Do you have any details (an engineering link)? The US is always full of surprises for me. Men on the Moon forty years ago, but DC mains supplies until just recently. I guess, if they had hung on for another decade or so, they could have stayed with DC and used the very highest tech with low losses.
  14. It really isn't that much different from the split-phase systems inside buildings in the U.S. right now. Which things got connected to which side? I suspect every other light was divided up between the sides. Something like this. It most likely started with street lights and was added onto later.