
AC voltage drop across power lines

  1. Jun 16, 2010 #1
    so the reason why alternating current is used to deliver energy to households is that, for a given wire, the voltage drop is always less with AC than with DC, right?
    but when I calculate the voltage drop it comes out to be the same!
    R = P(length of wire)/(cross-section area of wire)
    I see no reason why an alternating current would drop less.
    Is there an equation I am missing?
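    For anyone who wants to check this numerically, here is a minimal sketch of the resistance and voltage-drop calculation the question describes (the copper resistivity is a textbook figure; the wire length, cross-section and current are made-up illustrative values):

```python
# Voltage drop along a wire: R = rho * L / A, V_drop = I * R.
# The drop depends only on current and resistance, so for equal
# current, AC and DC drop the same voltage (ignoring skin effect).
RHO_COPPER = 1.68e-8  # ohm-metre, resistivity of copper

def wire_resistance(length_m, area_m2, rho=RHO_COPPER):
    """Resistance of a uniform wire: R = rho * L / A."""
    return rho * length_m / area_m2

def voltage_drop(current_a, length_m, area_m2, rho=RHO_COPPER):
    """Ohmic voltage drop V = I * R over the wire."""
    return current_a * wire_resistance(length_m, area_m2, rho)

# Example: 1 km of 10 mm^2 copper carrying 10 A
R = wire_resistance(1000.0, 10e-6)      # 1.68 ohm
V = voltage_drop(10.0, 1000.0, 10e-6)   # 16.8 V
```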
  3. Jun 16, 2010 #2


    Science Advisor
    Gold Member
    2017 Award

    No. The reason for using AC is that you can conveniently transform it from the generator voltage to an appropriate voltage for transmission and then for use in the home.

    Where did you get the idea that power loss is less? V=IR applies for all signals.
  4. Jun 16, 2010 #3
    Actually, DC will have lower I²R losses than AC for the same cable cross-section, due to the skin effect.

    The reason is, as sophiecentaur pointed out, the ability to transform the voltage up and down. Today, however, power electronics make it possible to convert between AC and DC efficiently, which gives DC an advantage over AC in many situations, e.g. underwater cables.

    T. A. Edison's proposal of using DC wasn't so dumb after all, although semiconductors wouldn't see daylight until some 80 years later.

    In Europe we use 420 kV and lower in the distribution grid, and 230 V in households. The power delivered is the same, but the current is much lower (P = V·I), and hence the I²R losses are lower. Work out the difference in power losses and you see the reason.

    Your resistance formula seems to be missing "rho", the resistivity in ohm-metres.
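    The 420 kV vs 230 V comparison above can be worked out in a few lines (a sketch; the 1 MW load and the 5 Ω line resistance are made-up illustrative numbers, not values from the thread):

```python
# Same delivered power, same line resistance: compare the I^2*R loss
# at household voltage with the loss at transmission voltage.
def line_loss(power_w, voltage_v, resistance_ohm):
    """I^2 * R loss for a line delivering power_w at voltage_v."""
    current = power_w / voltage_v      # P = V * I  =>  I = P / V
    return current ** 2 * resistance_ohm

P = 1e6        # 1 MW delivered (illustrative)
R_LINE = 5.0   # ohm, illustrative line resistance

loss_lv = line_loss(P, 230.0, R_LINE)    # huge at 230 V
loss_hv = line_loss(P, 420e3, R_LINE)    # about 28 W at 420 kV
ratio = loss_lv / loss_hv                # = (420e3 / 230)^2, millions
```

    The loss ratio scales as the square of the voltage ratio, which is the whole argument for high-voltage transmission.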
  5. Jun 16, 2010 #4
    What? You mean P?
  6. Jun 17, 2010 #5
    The skin depth at 60 Hz is about 8.5 mm for copper. http://en.wikipedia.org/wiki/Skin_effect This makes the diameter of wire at which the skin effect begins to be noticeable about 19 mm, or AWG #00000000.
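    The figure quoted above follows from the standard skin-depth formula δ = sqrt(ρ / (π f μ)). A quick sketch (the resistivities are textbook figures for copper and aluminium; non-magnetic conductors are assumed, so μ ≈ μ₀):

```python
import math

# Skin depth: delta = sqrt(rho / (pi * f * mu)).
# For non-magnetic conductors mu ~= mu_0 = 4*pi*1e-7 H/m.
MU0 = 4 * math.pi * 1e-7

def skin_depth(rho, freq_hz, mu=MU0):
    """Skin depth in metres for resistivity rho (ohm-m) at freq_hz."""
    return math.sqrt(rho / (math.pi * freq_hz * mu))

delta_cu_60 = skin_depth(1.68e-8, 60.0)   # ~8.4 mm, copper at 60 Hz
delta_al_50 = skin_depth(2.82e-8, 50.0)   # ~12 mm, aluminium at 50 Hz
```

    Both results match the values quoted in this thread.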
  7. Jun 17, 2010 #6
    Multi-stranded wire with a supporting steel core significantly reduces the skin effect at 60 Hz. The skin effect is not as much of an issue as with solid conductors, which are not used for this reason.

    To help clarify what sophiecentaur was saying, AC is used because it can be transmitted at higher voltages than needed and cheaply converted down to useful voltages with transformers. The same cannot be said of DC transmission.

    The higher-voltage transmission means that less current has to be carried by the conductor per unit power delivered. With less current comes less power loss in the transmitting cables.

    So, in a way, the statement is true: for the same conductor size, AC can deliver more power, but only because it can be transmitted at a higher voltage than the end use.
    Last edited: Jun 17, 2010
  8. Jun 17, 2010 #7
    Sorry, didn't see that. Although it's a lowercase "p", whereas uppercase P denotes power.
  9. Jun 17, 2010 #8
    thank you, you've given me more than I asked for, but now I am curious.
    take a look at the sketch of this wire:
    http://www.mahanson.com/images/Hendrix%20Cable.jpg [Broken]
    those copper wires wrapping around the outside must be for reducing the skin effect, right?

    I have been reading up on the skin effect:
    I know the equation now, and it's also telling me I should look up something called the proximity effect, which may or may not have any effect in this situation.

    my new question is: if I know the dimensions of this wire (the thickness, the degrees per foot at which the outer wires wrap around, etc.),
    can I calculate the skin effect for this oddly dimensioned cable?
    or is the only way to know how different wrapping techniques affect the skin effect to try them in an experiment?
    Last edited by a moderator: May 4, 2017
  10. Jun 17, 2010 #9
    To my knowledge the cable in the picture has no such function.

    Aluminium Conductor Steel Reinforced (ACSR) is the name of the cable used in power transmission. It looks like http://www.powercablemanufacturers.com/picture/aerial-cable/acsr-cable.jpg [Broken] The one in the link is non-insulated and used in overhead power lines.

    Not sure I got your last question, but:
    For a round cable you only need to know the skin depth; the current travels only in the outer layer of the cable, within a distance (the skin depth) of the surface.
    For aluminium at 50 Hz the skin depth is about 12 mm, so in a cable larger than 12 + 12 = 24 mm in diameter almost no current travels in the centre. Hence you can fill the centre with whatever you want.

    So why use cables larger than 24 mm, you say? The area in which the current travels still increases with larger diameter, and the resistance gets lower.

    The proximity effect comes into play when two cables lie near each other and their magnetic fields act on each other. This decreases the effective conductivity of the cables.
    Last edited by a moderator: May 4, 2017
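    The "outer layer" picture above gives a quick way to estimate the effective conducting area of a round cable (a rough sketch: it treats the current as flowing uniformly in an annulus of thickness equal to the skin depth, ignoring the gradual exponential decay of current density, and the 40 mm diameter is a made-up example):

```python
import math

# Rough model: current flows only in an outer annulus of thickness
# delta (the skin depth); the centre carries almost none.
def effective_area(diameter_m, delta_m):
    """Conducting cross-section of a round cable with skin depth delta."""
    r = diameter_m / 2.0
    if r <= delta_m:             # thin cable: whole cross-section conducts
        return math.pi * r ** 2
    return math.pi * (r ** 2 - (r - delta_m) ** 2)   # annulus area

# 40 mm aluminium cable at 50 Hz (delta ~= 12 mm):
full = math.pi * 0.020 ** 2
eff = effective_area(0.040, 0.012)
# eff / full = 0.84: a larger diameter still buys conducting area,
# which is why cables thicker than 2 * delta are still worthwhile.
```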
  11. Jun 17, 2010 #10

    one more thing: the wires are wrapped around each other in the middle but not insulated from one another, so do I have to account for some inductance with regard to voltage drop over a distance?
  12. Jun 18, 2010 #11
    The cable can be treated as one solid wire; the strands are small and twisted for the flexibility of the cable. No insulation is needed, and the steel inside has only the function of mechanical strengthening.

    A transmission line isn't purely resistive; as a whole system (3 phases) it has resistance, inductance and capacitance, and must be treated as an impedance.

    Inductance due to the proximity effect and self-inductance.
    Capacitance to earth.

    For simplicity, if you don't know complex calculations, just take resistivity into account when calculating voltage drop. But use the correct cross-section if using large cables (skin effect).

    The impedance becomes important when calculating short-circuit currents.
  13. Jun 18, 2010 #12
  14. Jun 21, 2010 #13


    Science Advisor
    Gold Member
    2017 Award

    In practical terms, AC is a far better bet than DC for distribution, but there is not only the impedance / power factor consideration. The need for synchronisation of multiple generators is always there. That's (at least one reason) why they use a DC link between the UK and France, and it's worth all the additional rectifiers and inverters.
    Despite the very low frequency of mains AC, lines a thousand miles or more in length can introduce some strange transmission-line effects. For instance, a line a quarter wavelength long which is open-circuit at one end will look like a short circuit at the other - a potential embarrassment!
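    The quarter-wave effect mentioned above can be checked with the standard lossless-transmission-line input-impedance formula (a sketch assuming free-space propagation speed; real lines are somewhat slower, and the 300 Ω characteristic impedance is an illustrative value):

```python
import math

C = 3.0e8  # m/s, free-space propagation speed (assumption; real lines are slower)

def quarter_wavelength_km(freq_hz):
    """Quarter of the free-space wavelength, in km."""
    return (C / freq_hz) / 4.0 / 1000.0

def input_impedance(z0, zl, length_m, freq_hz):
    """Lossless line: Zin = Z0*(ZL + j*Z0*tan(bL)) / (Z0 + j*ZL*tan(bL))."""
    beta = 2.0 * math.pi * freq_hz / C
    t = complex(0.0, math.tan(beta * length_m))
    return z0 * (zl + z0 * t) / (z0 + zl * t)

# At 50 Hz a quarter wave is 1500 km; an open-circuited far end
# (modelled as a huge load impedance) then looks almost like a
# short circuit at the sending end.
length = quarter_wavelength_km(50.0) * 1000.0    # 1.5e6 m
zin = input_impedance(300.0, 1e9, length, 50.0)  # |zin| is tiny
```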
  15. Mar 9, 2011 #14
    how does setting up a substation along a long-distance transmission line reduce transmission losses?
  16. Mar 9, 2011 #15


    Science Advisor
    Gold Member
    2017 Award

    If there is no significant step-up in voltage (e.g. 400 V to 11 kV), followed at the other end by a step-down, then there won't be. It is only when the current flowing in the long line is reduced significantly that the I²R losses are reduced.

    But a step-up will mean that the consumers down the line may get nearer to their required supply volts.
  17. Mar 9, 2011 #16
    OK, another query: why are transmission voltages in multiples of 11? If it's because of the form factor, how does it depend on the form factor? Please give me a formula for calculating the required transmission voltage according to the distance of transmission and the load to be transmitted.
  18. Mar 9, 2011 #17


    Science Advisor
    Gold Member
    2017 Award

    I don't know where the 11 comes from. I suspect, though, that it is as simple as giving yourself 10% headroom and starting from an arbitrary 100 V. But the original 250 V (or the subsequent 240 V) in the UK is not a multiple of 11, and neither is 400 kV.

    Someone, somewhere may have sat down and done some sums, but I bet the values were pretty well arbitrarily chosen.
    As a matter of fact, with the comparatively long domestic distribution lines in the US, it's surprising that 110 V was chosen when you consider that lines in the UK are, on average, a lot shorter because most housing is a lot denser, and we chose 250 V.
  19. Mar 9, 2011 #18


    Science Advisor
    Gold Member
    2017 Award

    I think the voltages are a bit arbitrary. 110V may have been adopted after a choice of 100V and then an allowance for voltage drop on inadequate supply cables. Strange that the UK chose 250V, originally, then went to 240 and then to 230, for unity with Europe. No multiples of 11 there - and neither on the 400kV standard.

    The choice of ratios for low, intermediate and high voltage transmission would have been based on cost of towers, cable and a lot of other factors but in a pretty fuzzy way, I'm sure. We really need someone (like my Dad, for instance) who was around when some of these standards were brought in (not the first ones, of course - I'm not that old).

    It always seemed strange to me how the 110V was chosen in the US, where spacing between dwellings is relatively high but 250V was chosen in the densely populated UK where cables would have been significantly shorter, on average, with lower resistive losses. Were the Americans so rich that they could afford all that extra copper?
  20. Mar 9, 2011 #19
    I'm not sure what you mean here. Can you expand on that a bit? Preferably with a diagram of some kind.

    You are referring to the wavelength of the AC signal with respect to c (speed of light), correct? If I remember c right, 60 Hz has a 5km wavelength, 50 Hz is 6km. So a quarter wavelength line would only have to be a couple miles long at most, not thousands.
  21. Mar 9, 2011 #20


    Science Advisor
    Gold Member

    I think you need to redo that math, Jiggy. A 1/4 wave at 60 Hz is about 1250 km.
  22. Mar 9, 2011 #21
    Bah, my bad. 300,000 km/s, not 300,000 m/s. Stupid units.
  23. Mar 9, 2011 #22


    Science Advisor
    Gold Member
    2017 Award

    I would love to know the origin of that; I have never seen anything specific in print.
    Here in Australia and New Zealand we also use the 220-240 V standard.
    In my travels around the USA, I was very surprised at the "interesting" cabling in some homes at 110 V. It made me cringe when one considers that the current flowing at 110 V is twice that at 220 V for an appliance of the same wattage.
    Mind you, on trips to the Philippines (my wife's homeland), looking at the way 110 V is strung around and connected into with basic barrier connectors really freaks you out, haha.

  24. Mar 9, 2011 #23


    Science Advisor
    Gold Member

    I made a post about the origins not long ago. I don't recall what thread though.
    Found it, here it is:

    The reason 100 volts was picked (if you read my link) was that 100 volts was judged high enough to do the job practically, keeping the current low enough to prevent excessive loss, yet low enough to be considered relatively safe. I recall reading these things just a few days before I posted in that thread, but I can't recall where. Try wiki, I suppose.
    Last edited: Mar 9, 2011
  25. Mar 9, 2011 #24
    But the voltage is still half that. And it's voltage that causes current to flow. I can't figure out why people think that the higher amps are dangerous.

    Looking at a naive, extreme example, which would you rather handle: 1kV@1mA or 1mV@1kA?* Each one is only 1W, but I think there's a big safety difference there.

    If we were dealing with constant current sources and our bodies were being put in series with the load, then the higher current would be bad. 1mA would be hardly noticeable, but 1kA will cook your goose pretty tender (if it doesn't explosively boil away).

    But, the vast majority of power sources are constant (somewhat) voltage sources, and most of the time (I think at least) a shock happens because our bodies provide a path to ground in parallel with the load. In the parallel case, it's the voltage that's applied across the body. 1mV is harmless, but 1kV is serious stuff.

    By my thinking, 1mV@1kA would be much, much safer to handle than 1kV@1mA. That's the reason why the long distance transmission wires that carry the huge voltages (hundreds of kilovolts) are put so high up and so carefully insulated, even though they carry less current; it's the voltage value that's dangerous.

    From this analysis, I conclude that 240V is more dangerous than 120V; though 120 is still plenty dangerous.

    If my thinking is too naive, I'd love to be corrected.

    Unless I misinterpreted you, and you were referring to the higher current creating more heat and increasing the risks of electrical fires and the like. In which case this entire post is pretty misguided.

    * Yes, I realize that these numbers are highly unrealistic, but I've found that the quickest way for me to understand things is to talk about what happens at the extreme ends rather than muck around with a dozen middle-of-the-road examples.
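    The parallel-shock argument above boils down to Ohm's law across the body (a sketch; the 1 kΩ body resistance is a commonly quoted rough figure for a hand-to-hand path, assumed here rather than taken from the thread):

```python
# In the parallel case the body appears as a resistance across the
# source, so the current through it scales with the source voltage.
BODY_RESISTANCE = 1000.0  # ohm, rough assumed figure for the human body

def body_current_ma(source_voltage):
    """Current through the body, in mA, for a given contact voltage."""
    return source_voltage / BODY_RESISTANCE * 1000.0

i_120 = body_current_ma(120.0)  # 120 mA
i_240 = body_current_ma(240.0)  # 240 mA, twice as much
```

    Both figures are well above commonly cited fibrillation thresholds, which supports the post's conclusion that 240 V is more dangerous than 120 V, while 120 V remains plenty dangerous.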
  26. Mar 9, 2011 #25


    Science Advisor
    Gold Member

    I would generally say that higher voltage lower current is more dangerous. But, think about the poor guy who pops the hood on his car and accidentally gets his metal watch band between the positive terminal of the battery and a metal part of the car. Watch band is red hot in a second.