What are the pros and cons of each?
Do you mean 110VAC versus 220VAC mains power? Do you mean in the context of home electrical power (like a clothes drier that runs off of 220VAC versus 110VAC)? Or do you mean European power 230VAC at 50Hz versus US power at 110VAC and 60Hz? What is the context of your question? And where did 250 come from?
I meant "European power 230VAC at 50Hz versus US power at 110VAC and 60Hz"
There are variations of the 250V and 110V standards: there is 120, 130, and 140, and there is 220, 230, 240, and 250 volts. Hence most electric appliances have the ~ sign next to the input voltage.
That made me chuckle. The "~" sign on the input voltage for appliances would mean AC, not approximately. The ~ sign is used for both, but in the context of the power mains, it stands for AC. It is definitely true that there are tolerances on the various AC mains standards, and when discussing a particular power grid, the nominal voltage is usually what is listed. There are some pretty unusual standards around the world.
With respect to the EU 230Vrms at 50Hz versus the US standard of 110Vrms at 60Hz, the higher voltage in the EU gives more power available with less loss in the wiring, but requires higher voltage rated components in the power supplies and in the devices that attach to the power mains. In typical US home wiring, there are two 110Vrms wires (Hot1 and Hot2) and one Neutral wire, and the two 110Vrms lines are 180 degrees out of phase. So you can wire 220Vrms to your clothes drier or other high-power devices, and wire either 110Vrms circuit to other devices. This flexibility provides a lot of the EU voltage advantage, while allowing most devices to use the easier 110Vrms requirements for safety spacings and component specs.
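Just to illustrate the split-phase arithmetic above, here's a quick numerical sketch (idealized sine waves, nominal 110Vrms legs) showing that two 110Vrms legs 180 degrees out of phase give 220Vrms between the two hots:

```python
import math

# Sketch of the US split-phase service described above:
# two 110 Vrms legs, 180 degrees out of phase with each other.
V_RMS = 110.0
AMPLITUDE = V_RMS * math.sqrt(2)   # peak voltage of each leg
FREQ = 60.0                        # Hz

def hot1(t):
    return AMPLITUDE * math.sin(2 * math.pi * FREQ * t)

def hot2(t):
    # 180 degrees out of phase with hot1
    return AMPLITUDE * math.sin(2 * math.pi * FREQ * t + math.pi)

# Voltage between Hot1 and Hot2, sampled over one full cycle:
n = 10000
samples = [hot1(i / (n * FREQ)) - hot2(i / (n * FREQ)) for i in range(n)]
rms = math.sqrt(sum(v * v for v in samples) / n)
print(round(rms))  # 220 Vrms between the two hot legs
```

Each hot is 110Vrms relative to neutral, but because they swing in opposite directions, the hot-to-hot difference doubles.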
I've heard that 50Hz versus 60Hz does make a difference in how the human body reacts to accidental shocks (one makes it harder to let go or something), but I don't know the details of that.
Can you get a shock by touching only the neutral wire? I've heard that sometimes the voltage between neutral and ground is 30V.
Good question. I don't think they have a traditional neutral conductor in the EU (I could be wrong), but for the US, the NEC (National Electric Code) specifies that each neutral wire be connected to earth ground at the circuit breaker panel. So at the breaker box, the AC voltage on all the neutral wires is zero with respect to earth ground (and the hots are all at about 110Vrms with respect to the neutrals and earth ground). The respective hot/neutral pairs head out from the breaker panel through the home or office and connect up to their respective distributed parallel loads (lights, machines, etc.). The AC current that powers the circuit is flowing in both the hot and neutral wires, so the voltage on the neutral wire with respect to earth ground out at the end of the power cable is determined by the current flowing and the resistance of the neutral wire length back to the breaker box. If you have several amps of current through several ohms of wire, you will get several volts rms on neutral at the end of a long run of cable.
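The neutral-voltage effect described above is just Ohm's law on the return conductor. Here's a tiny sketch with illustrative (assumed, not measured) numbers:

```python
# V_neutral = I * R_wire: the voltage on the neutral at the load end,
# relative to earth ground at the breaker panel. The load current and
# wire resistance below are illustrative assumptions only.

def neutral_voltage(load_amps, wire_ohms):
    """Voltage rise on the neutral for a given return-path resistance."""
    return load_amps * wire_ohms

# e.g. a 15 A load through a long run with ~0.5 ohm of neutral resistance:
print(neutral_voltage(15.0, 0.5))  # 7.5 V rms on the "grounded" neutral
```

So a few volts on a long, heavily loaded run is normal; 30V suggests something wrong, as noted below.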
But 30V is a pretty big voltage to get on your neutral wire, so if you are seeing it, I'd suspect some problem with the grounding of the neutral wire at the breaker panel, or else some resistive connections (maybe corroding wire nut connections) in the series distribution of the neutral wire.
The place where you more often get a 30V-level surprise shock in your home is when you handle the cable TV coax and happen to ground yourself to something. The cable TV coax outer shield conductor is earth grounded back in the distribution network somewhere in your neighborhood, which can be a long way away. You can easily get several tens of volts rms between that point and the earth ground at your house. The first few times I got shocks while connecting up cable TV stuff, I thought I was losing my mind....:yuck:
I don't know about the US but here in Egypt the ~ sign is used mainly for approximation, the reason I believe that is because most AC/DC adapters have the following rating on them
input : ~220 VAC
output : ~12 VDC
Thanks for the reply, but why does the US use 110V in the first place?
I seem to remember that the American inventor Thomas Edison was involved in coming up with the US power grid, so I used AskJeeves.com to search on this:
thomas edison and 110V AC power
It gave several hits, but this is basically what I was looking for. The combination 110/220VAC system worked well for the early American systems, and was considered a breakthrough invention at the time:
I'm not sure why the EU countries went more for a basic 230VAC system, and I have no idea about the 50/60Hz differences. Probably some interesting history in that as well. Maybe try AskJeeves....
Since power (in watts) is essentially volts times amps (ignoring the phase angle), doubling the voltage allows you to halve the amps. The same amount of work is done. Lower amps allows you to use smaller gauge wire (cheaper) and has lower line losses (I squared R) and less voltage drop in the line.
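The power-versus-loss tradeoff above can be put in numbers. A minimal sketch (resistive load and an assumed round-trip wire resistance, ignoring power factor as noted):

```python
# P = V * I, so doubling the voltage halves the current for the same
# delivered power, and I^2 * R line loss drops by a factor of 4.
# The wire resistance below is an illustrative assumption.

def line_loss(power_w, volts, wire_ohms):
    amps = power_w / volts
    return amps ** 2 * wire_ohms

P = 2200.0   # watts delivered to the load
R = 0.2      # assumed round-trip wire resistance, ohms

loss_110 = line_loss(P, 110.0, R)  # 20 A -> 80 W lost in the wiring
loss_220 = line_loss(P, 220.0, R)  # 10 A -> 20 W lost in the wiring
print(loss_110, loss_220)
```

Same work delivered, quarter the wire loss at double the voltage, which is why smaller gauge wire suffices.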
However, the higher the voltage, the higher the insulation requirements and the more danger to the user.
As for Edison, he was actually a major impediment to the development of AC power. Many of the frequencies and voltages used today are leftovers from the developments of Nikola Tesla, who designed the first 3-phase devices for Westinghouse. If you ever want to read some interesting (and sometimes gruesome) stories, look into the war that Thomas Edison (advocate of DC current) and George Westinghouse (advocate of AC current) got into in their efforts to control the early energy markets. Edison performed all sorts of pseudo-scientific experiments in electrocution to try to prove the inherent dangers of AC current vs DC current. It's actually one of the reasons that the infamous electric chair came into use in the USA for executions.
"Nikkoli Telsa, A Man Out Of Time" is a good start.
Welcome to PF, WFO - good post.