Why do utility companies decrease the voltage sent to houses?

In summary: if you stuck a metal object between the two terminals of a car battery, the current through the metal would be enormous, yet you could still touch either terminal with your bare hands. So while current is what does the harm, voltage is the more important factor in whether a source is dangerous.
  • #1
partialfracti
I have read that most power lines on utility poles in residential neighborhoods carry about 4800 volts. My sources say that the transformers on utility poles decrease the voltage from 4800 volts to about 220 volts for most houses. Someone told me that the utility companies decrease the voltage sent to houses from 4800 volts to 220 volts to make the electrical power in houses less likely to kill a person who is shocked. That answer seemed plausible to me until I learned that when transformers decrease the voltage they increase the available current in proportion, so the total power output never changes.

Why do utility companies decrease the voltage sent to houses to 110 or 220 volts instead of 4800 volts?
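To make the premise concrete, here is a minimal Python sketch of the ideal-transformer relation (power in equals power out, so stepping the voltage down steps the available current up); the 1 A primary draw is an assumed figure for illustration:

```python
# Ideal transformer: V_primary * I_primary = V_secondary * I_secondary
# (losses ignored). The 4800 V and 240 V figures are from the question;
# the 1 A primary current is an assumed example value.

def secondary_current(v_primary, i_primary, v_secondary):
    """Current available on the secondary of an ideal transformer."""
    power = v_primary * i_primary       # power is conserved
    return power / v_secondary

i_sec = secondary_current(4800, 1.0, 240)
print(i_sec)                                      # 20.0 A available at 240 V
print(4800 * 1.0, "W in,", 240 * i_sec, "W out")  # 4800.0 W on both sides
```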
 
  • #2
It is decreased because that kind of voltage is impractical and unnecessary in residences. We don't drive semi tractors to work for the same reason.
 
  • #3
Safety, for one!

They transmit power at a much higher voltage so they can get more power through with fewer amps, which lets them use smaller wires (if you can call them that!). When it comes down to the final destination, they step it down and send it to each house as 220 V single phase.

In the middle of the transmission (intercity or interstate), I think the voltage is much higher than 4800 V.
 
  • #4
Part of the reason is that transmission of power is quite lossy, and if you transmit for example 100V * 10A = 1000W, then your power loss = I^2*R = 10*10*R = 100R

If you instead, via a transformer, use 1000V * 1 A = 1000W, then your power loss is only 1*1*R = R

If you carry this on and use 100,000V * 0.01A = 1000W, then your power loss is 0.01*0.01*R = 100*10^-6R.

But most appliances need more than 0.01 A, so while transmitting at a much higher voltage and lower current (the same power all the way through), via a transformer at the power station, works well for distribution, for domestic use the power needs to be transformed back into something usable.
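A quick numerical check of the scaling in the post above; a minimal sketch, assuming a round 1 Ω of line resistance (the resistance value is illustrative, not from the thread):

```python
# I^2 * R line loss for the same 1000 W delivered at different voltages.
R_LINE = 1.0     # ohms, an assumed round number for illustration
POWER = 1000.0   # watts to deliver

for volts in (100, 1_000, 100_000):
    amps = POWER / volts          # same power, less current at higher V
    loss = amps ** 2 * R_LINE     # power dissipated in the line itself
    print(f"{volts:>7} V -> {amps:>8.3f} A, line loss {loss:.6f} W")
```

As in the post: 100 V costs 100 W of line loss, 1000 V costs 1 W, and 100,000 V costs only 0.0001 W for the same delivered power.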
 
  • #5
Averagesupernova said:
It is decreased because that kind of voltage is impractical and unnecessary in residences. We don't drive semi tractors to work for the same reason.

I suspect that this might be true. To my mind, the only potential problem with your answer is that the current is increased as much as the voltage is decreased, to keep the total power the same.

When the transformers increase the current going from a utility line to a house, is that type of current practical and necessary in residences?
 
  • #6
yungman said:
Safety, for one!

That answer does not make much sense to me because the current is what kills a shocked person, not the voltage per se. The transformers increase the current.



yungman said:
They transmit power at a much higher voltage so they can get more power through with fewer amps, which lets them use smaller wires (if you can call them that!).

Ah. I think this is a hint to the final answer.
 
  • #7
Averagesupernova, yungman, and zryn, my conclusion from what I've read from your posts is that the answer to my original question is the following:

Utility companies decrease the voltage sent to houses for two reasons: (1) 4800 volts is an unnecessarily high and wasteful voltage for running household appliances, and (2) the corresponding increase in available current is necessary to run household appliances.

Averagesupernova, yungman, and zryn, do you agree with my conclusion?
 
  • #8
I would say that all of us who posted are correct. We could make our electrical systems safe for 4800 volts but it would be impractical to do so. It is NOT that all of our appliances require more than a fraction of an amp. A heating element in your oven, for instance, could easily roast a turkey passing only half an amp as long as it is driven with a 9000 volt source. That would be about 4500 watts, which I believe is common for most ranges, or at least close. But building such a heating element is not the practical way to do it. The way we do it now is much more practical.
-
As for your comment about the current:
That answer does not make much sense to me because the current is what kills a shocked person, not the voltage per se. The transformers increase the current.
It takes voltage to push the current through your body. An automobile battery can supply hundreds of amps, which is over 1,000 times the current that can kill a person. But with dry hands a 12 volt car battery cannot push enough current through the human body to do any harm. 4800 volts can EASILY push enough current through a human body to kill. For instance, suppose (hypothetically) you redesigned your house to run on 4500 volts. Right now you may have a 240 volt service that can supply up to 200 amps. That is 48,000 available watts. To get the same power at 4500 volts the supply would need to be able to source 10.66 amps. Much less current, right? But WAY more than enough voltage to push a fraction of that 10.66 amps through your body. So, which do you think is safer now?
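A rough Ohm's-law check of this argument; a sketch, where the body-resistance figures are assumptions (the dry-hands value is in line with the >1 MΩ figure mentioned in post #10 below; the wet-skin value is an assumed ballpark):

```python
# Shock current I = V / R_body for the voltages discussed in the thread.
# 1 Mohm (dry hands) and 1 kohm (wet/broken skin) are assumed ballpark
# figures; a few mA through the body is already hazardous.

for r_body, label in ((1_000_000, "dry hands"), (1_000, "wet skin")):
    for volts in (12, 240, 4800):
        milliamps = volts / r_body * 1000
        print(f"{volts:>5} V across {label:>9}: {milliamps:9.3f} mA")
```

With dry hands, 12 V pushes a negligible 0.012 mA while 4800 V pushes 4.8 mA; with wet skin, even 240 V pushes a dangerous 240 mA.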
 
  • #9
Power delivered to households is at 120 volts or 230 volts (depending on where you live) because this is the agreed voltage for appliances and lighting.

This was agreed for various reasons, but is not optional.

The current is not decided by the power company. It is a result of which appliances you turn on. The power company supplies a voltage and the current depends on the appliance.

Exactly how the power company supplies 120 volts or 230 volts to your house is decided on technical considerations.

Supplying power at high voltage means you need better insulation but can get away with thinner wires that contain less copper, for the same power.
Conversely, the same power delivered at a lower voltage will need more copper but not such large insulators on the poles.

Power transmission is nearly always done at much higher voltages than the domestic supply because this makes more efficient use of the expensive power poles and power lines.
 
  • #10
partialfracti said:
That answer does not make much sense to me because the current is what kills a shocked person, not the voltage per se. The transformers increase the current.


True if you have wet hands. But in real life, 99.9% of the time people have dry hands, and that WILL make a night and day difference. It is like touching the terminals of a 9 V battery with your finger vs. licking them with your tongue. The resistance of dry skin is very high, so it is hard to push enough current through to do harm. When I designed a ground fault detector, the trip spec was a few mA, so at line voltage you are talking about on the order of 100 kΩ of resistance. Hold an ohmmeter's probes, one in each hand, and you'll measure more than 1 MΩ. Wet your hands and that changes: water gets into your pores and into the blood (salt), and then you are talking about real conduction.


partialfracti said:
Ah. I think this is a hint to the final answer.

......
 
  • #11
partialfracti said:
Averagesupernova, yungman, and zryn, my conclusion from what I've read from your posts is that the answer to my original question is the following:

Utility companies decrease the voltage sent to houses because of two reasons: (1) 4800 volts is an unnecessarily high and wasteful amount of voltage to run household appliances and (2) the corresponding increase in current is necessary to have enough current to run household appliances.

Averagesupernova, yungman, and zryn, do you agree with my conclusion?

I think there are more reasons than what we posted; these are just some of the more obvious ones. Unless you are a power distribution engineer, you really don't have the full picture.

That is a good point that if you bus over 1 kV into every house, the insulation cost will be tremendous. At 220 V you really don't have to take too many precautions with insulation (relatively speaking).
 
  • #12
As with static / lightning, it is the electrical potential between two points that causes the 'spark'. If you up the voltage in the home, you increase the chance that it arcs. To put it simply.

So yes, as pointed out previously, insulation costs (and safety standards) go through the roof.
 
  • #13
There's tons of reasons why they transform the voltage down, and some of the better ones have already been listed. The real question, in my mind, is why they decided on 120 and 230V. Nothing in engineering is truly arbitrary.
 
  • #14
Mindscrape said:
There's tons of reasons why they transform the voltage down, and some of the better ones have already been listed. The real question, in my mind, is why they decided on 120 and 230V. Nothing in engineering is truly arbitrary.

You never know. Like the track width of railroads: I just learned recently that it has something to do with the wheel spacing of horse-drawn wagons or something! :rofl::yuck:
 
  • #15
Mindscrape said:
There's tons of reasons why they transform the voltage down, and some of the better ones have already been listed. The real question, in my mind, is why they decided on 120 and 230V. Nothing in engineering is truly arbitrary.

Good question. According to Wikipedia it seems quite arbitrary, but I believe it has something to do with the construction of filament lamps back in the 19th century.

In Europe the voltage has varied from 220 to 240 and the standard is now 230 V. With the electrical appliances in use nowadays, 230 V might not be the optimum voltage, but widespread implementation is the deciding factor.

400/230 V TN systems are also implemented, giving the possibility of using 400 V three-phase for motor applications, which is better than 230 V.
 
  • #16
The reason we have split phase 120/240 volts goes back to the DC days of Edison. The system was designed to run on 100 volts. Lamps at the end of the line ran dim due to the loss along the line, so they upped the voltage to 110 volts to help with that, knowing it wouldn't harm lights at the beginning of the circuit with just 10% overvoltage. They used our current system of split phase so that the shared neutral didn't carry ANY current when both legs were balanced. One less wire carrying current means less loss and brighter lights at the end of the line. Over the years the system has crept toward 120 volts, but anywhere between 110 and 120 is considered acceptable.
 
  • #17
The voltage used is an engineering compromise: higher voltages = more expensive insulation and higher shock risk, vs. lower voltages = less current available.

The actual choice of 110 V or 230-240 V is a bit arbitrary - it mostly depends on the historical accident of the first implementers.

230 V means you can have very high power devices - like 12 kW electric showers - but it means you need special transformers for high-risk applications like power tools.

110 V means less power available for regular appliances, so you need a separate 220 V supply for the washing machine and dryer.
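The trade-off in numbers; a minimal sketch (the 12 kW shower figure is from the post above, the kettle rating is an assumed typical value):

```python
# Current drawn by an appliance of a given power at each supply voltage:
# I = P / V. High-power loads become impractical at 110 V.

APPLIANCES = [("12 kW electric shower", 12_000),   # from the post above
              ("3 kW kettle", 3_000)]              # assumed typical rating

for name, watts in APPLIANCES:
    for volts in (110, 230):
        print(f"{name} at {volts} V draws {watts / volts:.1f} A")
```

At 110 V the shower would need roughly 109 A, which is why high-power loads get their own higher-voltage circuit.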
 
  • #18
Lower voltage means a higher current draw is required, which in turn increases resistance losses.
 
  • #19
yungman said:
Safety, for one!

partialfracti said:
That answer does not make much sense to me because the current is what kills a shocked person, not the voltage per se. The transformers increase the current.
Higher voltage requires greater insulation due to its ability to arc across larger gaps than lower voltage; look at the size of the insulators on the transformers supplying your home (if they are pole-mounted) - they are huge. If you had 4800 volts available at your house, everything would have to be bigger to avoid arcing: the cable would be bigger, and the receptacles would be bigger (with the slots farther apart). Even then, if you got too close to a receptacle, the current would tend to "jump out and grab you."

In my area, the high voltage tends to be around 12,000 volts. If you've ever done work near power lines, you'd know that you should keep at least 10 feet away from them (that distance increases with variables such as voltage, the type of tower or pole carrying the wires, the type of work being done, etc.). The reason is to keep the current from arcing over to you or your equipment.

And by the way, transformers don't "increase the current" as you say, they lower the voltage, thus making more current available to the user.
 
  • #20
Averagesupernova said:
But building such a heating element is not the practical way to do it. The way we do it now is much more practical.

Why is the way we do it now more practical than building a heating element for 4800 volts?

I did not see zgozvrm's response when I posted this. I believe zgozvrm answered my question above.
 
  • #21
zgozvrm said:
And by the way, transformers don't "increase the current" as you say, they lower the voltage, thus making more current available to the user.

What is the difference between increasing the current and making more current available to the user?
 
  • #22
partialfracti said:
What is the difference between increasing the current and making more current available to the user?

You aren't increasing the current.

You are allowing the user to draw more if required. If the user doesn't draw more, then you haven't increased the current.
 
  • #23
partialfracti said:
What is the difference between increasing the current and making more current available to the user?

Suppose you have a 100A service to your home. That means that the main breaker and the feeder wires are rated for a maximum load of 100A. Now, if the power company comes out and replaces your panel with a 200A service (larger breaker and feeder wires), they have made more current available to you. But, you don't have to use that extra available current.
In order to increase the current, you have to turn on more appliances, causing more current to be drawn from the service.
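The distinction in a small sketch (the 200 A rating and the appliance draws are assumed example figures):

```python
# "Available" current is what the service is rated to supply; "drawn"
# current is the sum of what the appliances actually in use demand.

SERVICE_RATING = 200.0                                # amps available
loads = {"oven": 20.0, "dryer": 25.0, "lights": 5.0}  # assumed draws, amps

drawn = sum(loads.values())
print(f"available: {SERVICE_RATING:.0f} A, drawn: {drawn:.0f} A")
# Upgrading the service raises SERVICE_RATING; it does not change `drawn`.
```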
 
  • #24
zgozvrm,

You have cleared it up well. I get it now.
 

1. Why do utility companies decrease the voltage sent to houses?

Utility companies decrease the voltage sent to houses for safety and practicality. High-voltage electricity requires far heavier insulation and can arc, cause fires, or damage electrical appliances. By decreasing the voltage, the risk of accidents and the cost of safe household wiring are both reduced.

2. How does decreasing the voltage affect the performance of electrical appliances?

Decreasing the voltage can affect the performance of electrical appliances by causing them to run slower or less efficiently. This is because appliances are designed to operate at a specific voltage, and if the voltage is lower than what they are designed for, they may not function as intended.

3. What is the standard voltage sent to houses by utility companies?

The standard voltage sent to houses by utility companies is typically between 110-120 volts in North America and 220-240 volts in Europe and other parts of the world. However, the exact voltage may vary depending on the location and the type of electrical grid in use.

4. How do utility companies regulate the voltage sent to houses?

Utility companies regulate the voltage sent to houses through the use of transformers. These devices can increase or decrease voltage levels as needed to ensure that the electricity being delivered is within a safe and optimal range for residential use.

5. Can decreasing the voltage save energy and reduce utility costs?

Within limits, yes. A slightly lower voltage makes resistive appliances draw less power, which can lower utility bills, and reduced electrical stress can extend the lifespan of some appliances, reducing the need for replacements. Appliances will not, however, operate properly far below their rated voltage.
