# Home AC supply - why live, neutral and ground needed?

For household AC, why do we need a four-wire connection: two live wires, a neutral and a ground? We get three wires from the street transformer: the secondary feeding the home has a center tap. The voltage from either end to the center tap of the secondary is 120 volts, and end-to-end it is 240 volts. We use one leg for the light bulbs, toasters, etc. (120 V), and we use both legs for the central air conditioner (240 V).
If you hook up a light bulb with only the live wire and use the ground instead of the neutral wire, it still lights up. I know power companies use the earth itself as a return path for current.
The center tap of the secondary coming to the home can be grounded, can it not?
Since the neutral wire and the ground wire are both return paths for current, why don't we just use the live wire and the ground wire, with no neutral wire, for single-phase appliances?
Why do we need a separate neutral wire for a single-phase 120-volt light bulb?
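The split-phase arithmetic in the question can be sketched numerically. This is a minimal illustration, assuming the nominal US figures (120 V RMS per leg, 60 Hz); the two winding ends are 180 degrees apart with respect to the center tap, so the leg-to-leg voltage is twice the leg-to-neutral voltage:

```python
import math

# Nominal US split-phase secondary: center tap grounded (neutral).
# Each end of the winding is 120 V RMS from the center tap, and the
# two ends are of opposite polarity at every instant.
V_leg = 120.0  # RMS volts, each end to center tap

def v_a(t, f=60.0):
    """Instantaneous voltage, leg A to neutral."""
    return V_leg * math.sqrt(2) * math.sin(2 * math.pi * f * t)

def v_b(t, f=60.0):
    """Instantaneous voltage, leg B to neutral (opposite polarity)."""
    return -V_leg * math.sqrt(2) * math.sin(2 * math.pi * f * t)

# Leg-to-leg voltage is the difference, i.e. twice the leg voltage:
# a 240 V RMS supply for heavy appliances.
t = 1 / 240  # quarter cycle of 60 Hz, where the sine peaks
print(v_a(t) - v_b(t))  # peak of the 240 V RMS leg-to-leg waveform
```

At the sampled instant the leg-to-leg value is 240·√2 ≈ 339 V, the peak of the 240 V RMS waveform.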


meBigGuy
Gold Member
Roughly speaking, the hot and neutral are the main electrical connections and the ground is for safety. What you don't actually need electrically is the ground.

When you conduct from hot to ground, the current needs to find its way back to the neutral. If it doesn't do so well, and you happen to be barefoot and providing a better conduction path, it will conduct through you.

AlephZero
Homework Helper
For household AC current, why do we need 2 live, a neutral and a ground 4-wire connection?
There is no logical reason why you "need" 4 wires, and many other countries have different wiring systems with only two wires (for double-insulated appliances) or three (two plus a safety ground).

The basic answer is "because that's the way the USA decided to wire electricity to houses". The other posts explain what the system is, but in the final analysis "there is no reason why".

Thanks for the explanation. There must be a good reason why the US decided to distribute AC that way. Does anyone know why it is done this way?

sophiecentaur
Gold Member
It was all due to history and some fairly arbitrary choices. My theory is that the US chose a basic supply voltage that was lower, and significantly safer (i.e. half), than the European choice. This was fine when the loads were lighting, a fridge and a radio set. Once people started to need more powerful appliances, they found the current demand was too high for a supply of around 100 V, and so they provided the option for twice that. Rather than change all the light bulbs etc., they had to provide homes with both: around 100 V for existing equipment and around 200 V for new equipment. Many, perhaps most, people get their own supply transformer.
The Europeans chose around 200V to start with and that is enough for a house full of electrical appliances. Any really heavy loads get a three phase supply. Very few users get their own transformer.

Borek
Mentor
The Europeans chose around 200V to start with and that is enough for a house full of electrical appliances. Any really heavy loads get a three phase supply. Very few users get their own transformer.
Yep - I have a 24 kW water heater that uses three phases. And 24 kW does qualify as a heavy load.

sophiecentaur
Gold Member
And they say size doesn't matter!

Averagesupernova
Gold Member
The reason we have what we have in the USA goes back to Edison and DC. 100 volts was chosen, and it was set up like our split-phase AC is. The reason was that the currents cancel in what we refer to as the neutral, reducing voltage drop along the line. The voltage was increased to 110 volts to compensate for dim lights at the end of the line; it was not enough of an overvoltage to cause trouble at the start of the line. So we have always had the low (100) and high (200) volts. It was not because we started to require more power. The scheme was adopted for AC as well.

sophiecentaur
Gold Member
The reason we have what we have in the USA goes back to Edison and DC. 100 volts was chosen, and it was set up like our split-phase AC is. The reason was that the currents cancel in what we refer to as the neutral, reducing voltage drop along the line. The voltage was increased to 110 volts to compensate for dim lights at the end of the line; it was not enough of an overvoltage to cause trouble at the start of the line. So we have always had the low (100) and high (200) volts. It was not because we started to require more power. The scheme was adopted for AC as well.
But what could have been the reason for wanting a dual standard of supply voltage? Engineers have always gone for single standards, wherever possible, in most fields. What is different about electrical supply? There are two standards of supply voltage in automotive engineering for very good reasons: 24 V batteries are more expensive for a given energy storage capacity, but 12 V starter motors for large engines require too much current for the cables if 12 V is used. A very good reason, I should say, but there are huge knock-ons for the owner of a 'large' vehicle who wants to replace some generic part and finds it is costing an arm and a leg compared with the 12 V equivalent. This EE forum is full of comments about it.

Have you any historical evidence that the original supplies were dual voltage? What use would the extra 220 V supply have been to the average house with a power demand of a couple of kW? A direct question has been asked about this, so we should come up with an answer if we can. When electricity was in its early days, I could imagine terrific resistance to totally re-equipping with new light bulbs and house wiring, so a bolt-on system would have been attractive. Think of the expense of changing all that in order to install a high-powered electrical heater (plus the need for three wires to the house, rather than two). People would just stick with gas or (very cheap) oil. How many years have US supplies been dual voltage?

That US supply system causes serious problems for 'the student' who tries to think of it as a 'two-phase' system and to relate it to the three-phase system used for serious electrical supplies.

Averagesupernova
Gold Member
I guess I need to be clearer. The dual-voltage system was used to help avoid voltage drop. Two batteries of 100 volts each were wired in series. The lights were run on 100 volts. The neutral wire (the one connected to the node where the batteries are joined) didn't carry any current when the currents from the two batteries were balanced. This obviously helped prevent voltage drop compared to running the same number of lights from a single 100-volt source. Why 200-volt bulbs were not used could be due to reliability as well as safety. Face it: most people don't get shocked by 220 volts in their home in the U.S., nor would anyone likely have gotten shocked by more than 100 volts in the Edison system. It wasn't more than a few years ago that I read that places on the east coast of the U.S. were still served by DC. I'm sure they have been switched over by now.
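The cancellation described above can be sketched in a few lines. This is a minimal illustration of the Edison three-wire idea; the load currents are hypothetical round numbers, not from any historical record:

```python
# Edison three-wire sketch: two 100 V sources in series, with loads on
# each side connected between an outer wire and the shared neutral.
# By Kirchhoff's current law, the neutral carries only the imbalance.

def neutral_current(i_side_a, i_side_b):
    """Neutral current is the difference between the two side currents."""
    return i_side_a - i_side_b

# Perfectly balanced lighting load: the neutral carries nothing,
# so it contributes no voltage drop at all.
print(neutral_current(10.0, 10.0))  # 0.0

# Unbalanced load: the neutral carries only the 2 A difference,
# far less than the 10 A a plain two-wire circuit would have to return.
print(neutral_current(10.0, 8.0))   # 2.0
```

This is why the scheme saved copper: the outer conductors carry the full load current, but the shared return wire can be sized for the imbalance only.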

sophiecentaur
Gold Member
Oh boy. Was that really a system? It sounds crazy. How did they select which lights to connect to which side? It would be a nightmare system to manage. Do you have any details (an engineering link)? The US is always full of surprises for me: men on the Moon forty years ago, but DC mains supplies until just recently. I guess if they had hung on for another decade or so, they could have stayed with DC and used the very highest tech with low losses.

Averagesupernova
Gold Member
It really isn't that much different from split-phase systems within buildings in the U.S. right now. What got connected to which side? I suspect every other light was divided up between the sides. This most likely started with street lights and was later added onto.

jim hardy
Gold Member
2019 Award
Dearly Missed
Looking at dlgoff's graphic
and replacing all the breakers with fuses like we had in the 1950s

failure of one of the fuses at "main breaker" doesn't put the house into complete darkness. Half the 120 volt lights and outlets still work.

That's an ADVANTAGE of the split system - better reliability in the days when slide rules and fuses roamed the earth
but was it one of the reasons? I don't know.
Our electrical code suggests that every room have feeds from two different circuits so a single failure won't put that room into complete darkness. Extending that thought to the incoming power is certainly logical, but I don't know if that was done.

Lots of houses here still have fuses. I've helped friends who were stumped when half their lights glowed dim and got brighter when they switched on the stove. A single open main fuse had put out half the lights, and when the water heater cooled down (so its element switched back in) it passed enough current from the "other phase" to run them at partial brightness. When fuses get sixty or so years old they seem more prone to open spuriously, especially if there have been roof leaks around the panel and corrosion has set in.

old jim

anorlunda
Staff Emeritus
Sophie's speculation about the historical motivations does not account for the dates and the personalities. Edison's three-wire scheme and choice of voltages evolved in 1881-1882; the European BEW system came in 1899. By analogy, compare this with the NTSC (USA) versus PAL (Europe) TV standards: PAL is better, but it did not exist early enough for the Americans. If you have any technology that evolves in a way that demands backward compatibility, it becomes extremely hard to displace it with a better idea. How many people posting on this thread are typing on a non-QWERTY keyboard?

In the 1970s an engineer in Russia calculated that 100 Hz is closer to optimal than 50 Hz or 60 Hz for power. So what are the chances that the world converts to this idea because it is better? Zero, nada, zip, null.

[Note: the advantages of Edison's three wire system apply equally to AC or DC, so that is not part of the debate.]

Here is an interesting but short version of the history

Quoting http://www.equitech.com/articles/enigma.html:
In 1882, Thomas Edison wired the town of Sunbury, Pennsylvania using a shared-common three-wire DC distribution system. The cost of copper wire was an important factor, so Edison's engineers devised a way to distribute two circuits using only three wires. The forerunner to modern power distribution, DC and AC versions of the “Edison Circuit” are still widely used.

Beginning with the Niagara Falls power project initiated by George Westinghouse and supervised by Nikola Tesla, AC distribution overtook DC and became the primary power system. With mostly lights and electric heaters loaded on the power grid, there was never much concern for voltage phase beyond what was considered to be "distribution convenience" for the utility. From almost the very beginning, 120-volt AC wiring has been conveniently unbalanced -- a "split" off half of a 240-volt single-phase grid.

Eventually, three-phase power was developed as a standard to suit heavy industrial users. Little was known at that time about harmonic distortion or other adverse effects on power systems created by impedance loading. Meanwhile, single-phase took a back seat to bulk three-phase power distribution because of the huge demand created by the efficient three-phase industrial motor. The three-wire Edison circuit was expanded another phase: 120/208- and 277/480-volt three-phase "wye" systems allowed for running three circuits using only four wires.

In electrical parlance, this multiple circuit wiring method is called a "Round Robin." The three-phase wye design enables single-phase fluorescent lighting, three-phase air conditioning and other three-phase motor loads (e.g. elevators) to be fed by one power distribution grid with the load current evenly distributed across the system -- ideal for commercial use.

But this system has one glaring fault. The level of interference created when a three-phase wye system is split up and used as three single-phase circuits is truly something to behold. For example, as much as 20% (or more) of the power used by fluorescent ballasts is reflected back onto the power grid in the form of reactive or harmonic currents -- now that’s a lot of distortion. In the late 80’s, a 40-plus-story office building in Los Angeles actually burst into flames because of these reactive currents. Incredibly, the origin of the fire was determined to be from excessive harmonic distortion in fluorescent lighting circuits which created a high-frequency current overload and literally a meltdown of the electrical wiring system. The First Interstate Bank fire in Los Angeles in May of 1988 was the event dubbed by the media as "The Towering Inferno" a la the Hollywood movie. Codes were adapted to remedy the fire danger, but the noise problem itself was never completely resolved.

Three-phase power nonetheless remains the bulk power of choice for utilities. When a utility furnishes single-phase service to an area, the standard procedure is for the utility to derive single-phase power by using one or two of the distribution grid's three phase elements. So even single-phase power is linked to the distorted three-phase grid. Typically, electrical power furnished by utilities contains 3% to 5% harmonic distortion. Single-phase service itself remains "split" into two 120-volt circuits as per Edison's original wiring design. In one form or another, these standards have been adopted and put into use around the world.
An important factor not mentioned in that history was that Edison's business model was to supply illumination only. Non-illumination applications of electricity evolved soon thereafter, but Edison was slow to adapt and he resisted them.

From an Edison biography (https://www.amazon.com/dp/B0084AVTNK/?tag=pfamazon01-20, a book all engineers would love, available for free) I learned that Edison's engineering skills far outweighed his business skills. He went broke several times. His motivation for the three-wire system was to save money on copper. Later, Edison invested in copper mines in Chile (the origins of Anaconda Copper) and he changed his mind: he thought that he could make more money selling copper than electricity. That was his reason for opposing the Tesla/Westinghouse ideas. AC versus DC was only the proxy issue for public relations purposes; the true underlying dispute was about the business model for electric utilities.

jim hardy
Gold Member
2019 Award
Dearly Missed

In a three-phase system, third harmonics all return through the neutral.
Rectifier power supplies produce a LOT of third harmonic, especially early computers.

So when one takes an old office building that was wired with small neutrals
and puts computer equipment in every nook and cranny,
he's apt to melt some of those neutrals.
The US NEC now requires the neutral for computer installations to be the same size as the phase conductors.
And there are now regulations on waveform and power factor correction for rectifier power supplies, to reduce both RFI and harmonic production.

That's how codes evolve - through mistakes.
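The triplen-harmonic effect jim describes can be demonstrated numerically. A minimal sketch, with illustrative amplitudes (the 0.3 third-harmonic fraction is an assumption, not a measured figure): the fundamentals of the three phases, 120 degrees apart, cancel in the neutral, while the third harmonics of all three phases line up and add.

```python
import math

def neutral_current(t, fund=1.0, third=0.3, f=60.0):
    """Instantaneous neutral current: the sum of three phase currents,
    each carrying a fundamental (phases 120 degrees apart) plus a
    third-harmonic component. Amplitudes are illustrative only."""
    total = 0.0
    for k in range(3):
        shift = k * 2 * math.pi / 3      # 0, 120, 240 degrees
        w = 2 * math.pi * f * t
        total += fund * math.sin(w - shift)         # fundamentals cancel
        total += third * math.sin(3 * (w - shift))  # thirds are in phase
    return total

# sin(3(w - shift)) = sin(3w - 2*pi*k) = sin(3w): every phase's third
# harmonic is identical, so the neutral carries 3x one phase's third.
t = 1 / 720  # instant where the third harmonic of 60 Hz peaks
print(round(neutral_current(t), 6))  # 0.9 = 3 * 0.3
```

So a neutral sized on the assumption of balanced sinusoidal loads can end up carrying more harmonic current than any single phase conductor, which is the failure mode behind the code change.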

jim hardy
Gold Member
2019 Award
Dearly Missed
Quoting http://illumin.usc.edu/122/a-powerful-history-the-modern-electrical-outlet/fullView/:
In the United States, the Westinghouse Company chose to standardize the operating frequency to 60 Hz, as suggested by Tesla, eliminating nine other possibilities. In Germany, however, the standardization was much simpler because one company, BEW, had a monopoly on electricity. The outcome of BEW's standardization was 50 Hz, which was most likely chosen because it fit better with the 1, 2, 5 metric standard. This same company chose to supply its consumers with more power by raising their voltage from 110 volts to 220 volts in 1899. This trend, as well as the 50 Hz operating frequency, spread across Europe over the next few decades [10].
It has been suggested that the US switch to the 220-volt system. In the 1950's the US did consider switching but then decided against it since most consumers already had a number of 120-volt products. A compromise was reached when the US employed Edison's three-wire system: one wire supplied +120 volts, another supplied 0 volts, and a third supplied -120 volts, so that stoves, washers and dryers, and other large appliances could access 220 volts, while smaller appliances could still operate on the lower 120 volts [10].
If I'm not mistaken, the first Niagara project was 20 Hz. I recall seeing flicker in incandescent bulbs as recently as 1970.

The house where I grew up was built in 1949 and had two-prong outlets: hot and neutral, no grounding prong.
Fortunately the wiring was in metal conduit, so in the '60s Dad was able to upgrade to three-prong outlets, with the metal conduit serving as the earthing conductor.

anorlunda
Staff Emeritus
If I'm not mistaken, the first Niagara project was 20 Hz.
Pretty close, Jim: 25 Hz. They continued to supply that until (I think) 1992, because the local factory motors that used that frequency had run 111 years with a few drops of oil as their only maintenance (so goes the legend, anyhow). What a shame that they couldn't leave it alone to see if it would have lasted a millennium.
