AC supply - why live, neutral and ground needed?

  1. Oct 27, 2013 #1
    I have checked How Stuff Works, About.com, Ask.com, Google and even asked an electrical engineer, and found no satisfying answer to the following questions:
    For household AC, why do we need a 4-wire connection: two live (hot) wires, a neutral and a ground? We get 3 wires from the street transformer, whose secondary has a center tap. The voltage from either end of the secondary to the center tap is 120 volts, and end-to-end it is 240 volts. We use one leg for the light bulbs and toasters (120 V) etc., and we use both legs for the central air conditioner (240 V).
    If you hook up a light bulb to the live wire and use the ground instead of the neutral wire, it still lights up. I know power companies use the earth itself as a return path.
    The center tap of the secondary coming into the home can be grounded, can it not?
    Since the neutral wire and the ground wire are both used as return paths, why don't we just use the live wire and the ground wire, with no neutral wire, for single-phase appliances?
    Why do we need a separate neutral wire for a single-phase 120 volt light bulb?
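    To picture the split-phase supply described above, here is a minimal sketch (my own illustration, assuming ideal sine waves and a nominal 120 V RMS per leg, not figures from the posts): measured from the grounded center tap, the two hot legs are 180 degrees apart, so between the two legs the voltages add up to 240 V RMS.

    Code (Python):
    import numpy as np

    # Nominal US split-phase service: each hot leg is 120 V RMS relative to the
    # grounded center tap (neutral), and the two legs are 180 degrees apart.
    f = 60.0                         # line frequency, Hz
    V_rms = 120.0                    # assumed nominal leg-to-neutral RMS voltage
    t = np.linspace(0, 1 / f, 1000)  # one full cycle

    leg_a = V_rms * np.sqrt(2) * np.sin(2 * np.pi * f * t)          # leg A to neutral
    leg_b = V_rms * np.sqrt(2) * np.sin(2 * np.pi * f * t + np.pi)  # leg B to neutral (inverted)
    leg_to_leg = leg_a - leg_b                                      # between the two hot legs

    rms = lambda x: np.sqrt(np.mean(x ** 2))
    print(f"leg-to-neutral RMS: {rms(leg_a):.0f} V")       # ~120 V
    print(f"leg-to-leg RMS:     {rms(leg_to_leg):.0f} V")  # ~240 V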
     
    Last edited: Oct 27, 2013
  3. Oct 27, 2013 #2

    meBigGuy

    Gold Member

    Roughly speaking, the hot and neutral are the main electrical connections and the ground is there for safety. The wire you don't actually need electrically is the ground.

    When current flows from hot to ground, it has to find its way back to the neutral at the transformer. If that path is poor, and you happen to be barefoot and providing a better conduction path, the current will flow through you instead.
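    As a rough, hedged illustration of that hazard (my own numbers, not from the post above; real values vary enormously with soil, footwear and skin condition):

    Code (Python):
    # Fault current returning through a poor earth path versus through a person.
    # All resistance values are assumed and purely illustrative.
    V = 120.0          # leg-to-neutral voltage, volts
    R_earth = 500.0    # assumed resistance of a poor earth return path, ohms
    R_person = 1000.0  # assumed hand-to-feet body resistance, ohms

    # Both paths sit in parallel between the faulted point and the grounded neutral,
    # so each branch sees the full 120 V.
    I_person = V / R_person
    print(f"current through the person: {I_person * 1000:.0f} mA")  # ~120 mA, far above the let-go threshold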

    Google "need for neutral and ground" and read what comes up.
     
  4. Oct 27, 2013 #3

    dlgoff

    Science Advisor
    Gold Member

    [Image attachment (not captured): a diagram of US household split-phase wiring and the breaker panel - referred to later in the thread as "dlgoff's graphic"]

  5. Oct 27, 2013 #4

    AlephZero

    Science Advisor
    Homework Helper

    There is no logical reason why you "need" 4 wires, and many other countries have different wiring systems with only two (for appliances that are constructed so they are guaranteed to be insulated) or three (two plus a safety ground).

    The basic answer is "because that's the way the USA decided to wire electricity to houses". The other posts explain what the system is, but in the final analysis "there is no reason why".
     
  6. Oct 28, 2013 #5
    Thanks for the explanation. There must be a good reason why the US decided to distribute AC that way. Does anyone know why it is done this way?
     
    Last edited: Oct 28, 2013
  7. Oct 28, 2013 #6

    sophiecentaur

    Science Advisor
    Gold Member
    2015 Award

    It was all due to history and some fairly arbitrary choices. My theory is that the US chose a basic voltage for its supply that was lower and significantly safer (i.e. half) than the European choice. This was fine when the loads were lighting, a fridge and a radio set. Once people started to need more powerful appliances, the current demand became too high for a supply of around 100 V, so the option of twice that was provided. Rather than change all the light bulbs etc., homes had to be given the choice of around 100 V for existing equipment and around 200 V for new equipment. Many (most?) US customers get their own supply transformer.
    The Europeans chose around 200 V to start with, and that is enough for a house full of electrical appliances. Any really heavy loads get a three-phase supply. Very few users get their own transformer.
     
  8. Oct 28, 2013 #7

    Borek


    Staff: Mentor

    Yep - I have a 24 kW water heater that uses three phases. And 24 kW does qualify as a heavy load.
     
  9. Oct 28, 2013 #8

    sophiecentaur

    Science Advisor
    Gold Member
    2015 Award

    And they say size doesn't matter!
     
  10. Oct 28, 2013 #9
    The reason we have what we have in the USA goes back to Edison and DC. 100 volts was chosen, and the distribution was set up the way our split-phase AC is today. The reason was that the currents from the two sides cancel in what we now call the neutral, reducing the voltage drop along the line. The voltage was later raised to 110 volts to help the dim lights at the end of the line; it was not enough of an overvoltage to cause trouble at the start of the line. So we have always had the low (100) and high (200) volts. It was not because we started to require more power. The same arrangement was adopted for AC as well.
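    A small sketch of the cancellation being described, using made-up load currents rather than historical figures: with the two sides of an Edison three-wire feed nearly balanced, the neutral carries only the difference of the two side currents, so the drop in the return conductor largely disappears compared with a plain two-wire feed.

    Code (Python):
    # Edison three-wire feed: the neutral carries only the imbalance between sides.
    # All numbers are assumed and purely illustrative.
    I_side_a = 12.0   # lamps on side A, amps
    I_side_b = 10.0   # lamps on side B, amps
    R_wire = 0.2      # resistance of one feeder conductor, ohms

    I_neutral = abs(I_side_a - I_side_b)            # only 2 A returns in the neutral
    drop_neutral = I_neutral * R_wire               # 0.4 V
    drop_two_wire = (I_side_a + I_side_b) * R_wire  # 4.4 V if all 22 A shared one return wire

    print(f"neutral drop: {drop_neutral:.1f} V vs two-wire return drop: {drop_two_wire:.1f} V")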
     
  11. Oct 29, 2013 #10

    sophiecentaur

    Science Advisor
    Gold Member
    2015 Award

    But what could have been the reason for wanting a dual standard of supply voltage? Engineers have always gone for single standards, wherever possible, in most fields. What is different about electrical supply? There are two standards of supply voltage in automotive engineering for very good reasons: 24 V batteries are more expensive for a given energy storage capacity, but 12 V starter motors for large engines would require too much current for the cables. A very good reason, I should say, but there are huge knock-on costs for the owner of a 'large' vehicle who wants to replace some generic part and finds it costs an arm and a leg compared with the 12 V equivalent. This EE forum is full of comments about it.

    Have you any historical evidence that the original supplies were dual voltage? What use would the extra 220 V supply have been to the average house with a power demand of a couple of kW? A direct question has been asked about this, so we should come up with an answer if we can. When electricity was in its early days, I could imagine terrific reaction against totally re-equipping with new light bulbs and house wiring, so a bolt-on system would have been attractive. Think of the expense of changing all that in order to install a high-powered electric heater (plus the need for three wires to the house, rather than two). People would just stick with gas or (very cheap) oil. How many years have US supplies been dual voltage?

    That US supply system gives serious problems for 'the student' who tries to think of it as a 'two phase' system and wants to relate it to the 'three phase' system used for serious electrical supplies.
     
  12. Oct 29, 2013 #11
    I guess I need to be clearer. The dual-voltage system was used to help avoid voltage drop. Two batteries of 100 volts each were wired in series. The lights were run on 100 volts. The neutral wire (the one connected to the node where the batteries are joined together) didn't carry any current when the currents from the two batteries were balanced. This obviously helped prevent voltage drop compared to running the same number of lights from a single 100 volt source. Why 200 volt bulbs were not used could be due to reliability as well as safety. Face it, most people don't get shocked by 220 volts in their home in the U.S., nor would anyone likely have been shocked by more than 100 volts in the Edison system. It wasn't more than a few years ago that I read that places on the east coast of the U.S. were still served by DC. I'm sure it has all been switched over by now.
     
  13. Oct 29, 2013 #12

    sophiecentaur

    Science Advisor
    Gold Member
    2015 Award

    Oh boy. Was that really a system? It sounds crazy. How did they select which lights to connect to which side? It would be a nightmare system to manage. Do you have any details (an engineering link)? The US is always full of surprises for me: men on the Moon forty years ago, but DC mains supplies until just recently. I guess, if they had hung on for another decade or so, they could have stayed with DC and used the very highest tech with low losses.
     
  14. Oct 29, 2013 #13
    It really isn't that much different from the split-phase systems inside buildings in the U.S. right now. What gets connected to which side? I suspect every other light was divided up between the two sides, something like that. It most likely started with street lighting and was expanded from there.
     
  15. Jan 1, 2016 #14

    jim hardy

    Science Advisor
    Gold Member

    Looking at dlgoff's graphic
    and replacing all the breakers with fuses, like we had in the 1950s:

    failure of one of the fuses at the "main breaker" doesn't put the house into complete darkness. Half the 120 volt lights and outlets still work.

    That's an ADVANTAGE of the split system - better reliability in the days when slide rules and fuses roamed the earth -
    but was it one of the reasons? I don't know.
    Our electrical code suggests that every room have feeds from two different circuits so that a single failure won't put that room into complete darkness. Extending that thought to the incoming power is certainly logical, but I don't know whether that was done.

    Lots of houses here still have fuses. I've helped friends who were stumped when half their lights glowed dim and got brighter when they switched on the stove. A single open main fuse had put out half the lights, and when the water heater cooled down its element passed enough current from the "other phase" to run them at partial brightness. When fuses get sixty or so years old they seem more prone to open spuriously, especially if there have been roof leaks around the panel and corrosion has set in.
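    To see where the half-bright lights come from, here is a minimal sketch with assumed (not measured) resistances: once one main fuse opens, the lamps on the dead leg are fed from the live leg in series through whatever 240 V loads bridge the two legs, so they see only a fraction of 120 V, and switching on another 240 V load (the stove) lowers the bridging resistance and brightens them.

    Code (Python):
    # One main fuse open: dead-leg lamps get fed through the 240 V appliances.
    # All resistances are assumed, illustrative hot-resistance values.
    V_leg = 120.0                   # live leg to neutral, volts
    R_heater = 240.0 ** 2 / 4500.0  # ~12.8 ohm, a typical 4.5 kW water-heater element
    R_stove = 240.0 ** 2 / 2400.0   # ~24 ohm, one 2.4 kW stove element
    R_lamps = 30.0                  # several incandescent lamps in parallel on the dead leg

    def lamp_voltage(r_bridge):
        # series divider: live leg -> bridging 240 V load -> dead-leg lamps -> neutral
        return V_leg * R_lamps / (r_bridge + R_lamps)

    v_heater_only = lamp_voltage(R_heater)                         # heater thermostat closed
    v_with_stove = lamp_voltage(1 / (1 / R_heater + 1 / R_stove))  # stove element on as well

    print(f"{v_heater_only:.0f} V on the lamps, rising to {v_with_stove:.0f} V when the stove comes on")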

    old jim
     
  16. Jan 1, 2016 #15

    anorlunda

    Science Advisor
    Insights Author
    Gold Member

    Sophie's speculation about the historical motivations does not account for the dates and the personalities. Edison's three-wire scheme and choice of voltages evolved in 1881-1882; the European BEW system came in 1899. By analogy, compare this with the NTSC (USA) versus PAL (Europe) TV standards. PAL is better, but it did not exist early enough for the Americans. If you have any technology that evolves in a way that demands backward compatibility, it becomes extremely hard to displace it with a better idea. How many people posting in this thread are typing on a non-QWERTY keyboard?

    In the 1970s an engineer in Russia calculated that 100 Hz is closer to optimal than 50 Hz or 60 Hz for power. So what are the chances that the world converts to this idea because it is better? Zero, nada, zip, null.

    [Note: the advantages of Edison's three-wire system apply equally to AC or DC, so that is not part of the debate.]

    Here is an interesting but short version of the history.

    An important factor not mentioned in that history is that Edison's business model was to supply illumination only. Non-illumination applications of electricity evolved soon thereafter, but Edison was slow to adapt and he resisted them.

    From an Edison biography (Edison, His Life and Inventions, a book all engineers would love, available for free) I learned that Edison's engineering skills far outweighed his business skills. He went broke several times. His motivation for the three-wire system was to save money on copper. Later, Edison invested in copper mines in Chile (the origins of Anaconda Copper) and he changed his mind: he thought he could make more money selling copper than electricity. That was his reason for opposing the Tesla/Westinghouse ideas. AC versus DC was only the proxy issue for public-relations purposes; the true underlying dispute was about the business model for electric utilities.
     
  17. Jan 1, 2016 #16

    jim hardy

    Science Advisor
    Gold Member

    See also Where lies AC efficiency?....

    In a three-phase system, the third harmonics from all three phases add up and return through the neutral,
    and rectifier power supplies produce a LOT of third harmonic -
    especially early computers.

    So when one takes an old office building that was wired with small neutrals
    and puts computer equipment in every nook and cranny,
    he's apt to melt some of those neutrals.
    The US NEC now requires the neutral for computer installations to be the same size as the phase conductors.
    And there are regulations now on waveform and power-factor correction for rectifier power supplies, to reduce both RFI and harmonic production.

    That's how codes evolve - through mistakes.
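    A minimal numerical sketch of why those third harmonics pile up (idealized, balanced phase currents with an assumed 30% third-harmonic content - my numbers, not from the post): the fundamentals of the three phases cancel in the neutral, but the third harmonics of all three phases are in phase with one another and simply add.

    Code (Python):
    import numpy as np

    # Balanced three-phase load currents with an assumed 30% third-harmonic content.
    # Fundamentals cancel in the neutral; third harmonics add arithmetically.
    t = np.linspace(0, 1 / 60, 2000)   # one 60 Hz cycle
    w = 2 * np.pi * 60                 # fundamental angular frequency
    h3 = 0.30                          # assumed third-harmonic fraction of the fundamental

    def phase_current(shift):
        return np.sin(w * t - shift) + h3 * np.sin(3 * (w * t - shift))

    i_a = phase_current(0)
    i_b = phase_current(2 * np.pi / 3)
    i_c = phase_current(4 * np.pi / 3)
    i_n = i_a + i_b + i_c              # neutral current = sum of the three phase currents

    rms = lambda x: np.sqrt(np.mean(x ** 2))
    print(f"phase RMS:   {rms(i_a):.2f}")   # ~0.74 per unit
    print(f"neutral RMS: {rms(i_n):.2f}")   # ~0.64 per unit - three times the third-harmonic current, not zero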
     
  18. Jan 1, 2016 #17

    jim hardy

    Science Advisor
    Gold Member

    If I'm not mistaken, the first Niagara project was 20 Hz. I recall seeing flicker in the incandescent bulbs as recently as 1970.

    The house where I grew up was built in 1949 and had two-prong outlets - hot and neutral, no grounding prong.
    Fortunately the wiring was in metal conduit, so in the '60s Dad was able to upgrade to 3-prong outlets, the metal conduit serving as the earthing conductor.
     
  19. Jan 1, 2016 #18

    anorlunda

    Science Advisor
    Insights Author
    Gold Member

    Pretty close, Jim: 25 Hz. They continued to supply that until (I think) 1992, because the local factory motors that used that frequency had run 111 years with a few drops of oil as their only maintenance (so goes the legend, anyhow). What a shame they couldn't leave it alone to see if it would have lasted a millennium.
     