
Calculating heat output from current

  1. Feb 13, 2012 #1
    This is actually a heat transfer question, but what I don't understand is the electrical part of it.

    Given the current (in amps) and the resistivity (in ohm·meters), how do I figure out how much heat is being generated? I would think you would need to know the voltage as well, but it is not given.
     
  3. Feb 13, 2012 #2

    vk6kro

    Science Advisor

    You don't need to know the voltage, but you do need to know the dimensions of the resistor so that you can work out the resistance from the resistivity: R = ρ·L/A, where L is the length and A is the cross-sectional area.

    Once you do know the resistance, Power = I² · R.
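
    In code form, here is a minimal sketch of that calculation; all of the numbers below are made-up illustrations, not values from the original problem.

    # R = rho * L / A from the resistivity and conductor dimensions,
    # then P = I^2 * R for the heat dissipated.  Illustrative numbers only.
    resistivity = 1.68e-8    # ohm*m, roughly copper at room temperature
    length = 2.0             # m, assumed conductor length
    area = 1.0e-6            # m^2, assumed cross-sectional area (1 mm^2)
    current = 5.0            # A, assumed current

    resistance = resistivity * length / area    # ~0.0336 ohm
    power = current**2 * resistance             # ~0.84 W dissipated as heat
    print(f"R = {resistance:.4f} ohm, P = {power:.2f} W")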
     
  4. Feb 13, 2012 #3
  5. Feb 15, 2012 #4
    I found the formula for this but was still having trouble grasping it conceptually. Then I realized something.

    A few weeks ago I was talking to some mechanical and electrical engineers about combined heat and power. Combined heat and power, aka co-generation or co-gen, is generating electricity on site, usually by burning natural gas, and using the waste heat to heat your buildings, domestic hot water, or some other process.

    All of the mechanical guys were saying what a great system it is, and among our reasons was minimizing voltage drop by generating the electricity on site. The electrical guy disagreed, saying that voltage drop is negligible and that the grid system is not flawed (although the generation systems certainly are).

    Now I realize that the part I wasn't understanding about the inefficiencies of the grid and the part I wasn't understanding about this problem are one and the same. The heat generated in a line, which represents the energy wasted through voltage drop, depends on the current through the line and its resistance, not directly on the voltage. So when power is transmitted at high voltage, the current is correspondingly low, and so are the heat dissipation and the wasted energy.

    Say you needed to transmit 100,000 VA from point A to point B. If you transmitted it at 100 V, the current would be 1,000 A; if you stepped the voltage up to 100,000 V, the current would be only 1 A. For a given line resistance the dissipated heat goes as the square of the current, so the thousandfold drop in current cuts the wasted energy by a factor of a million. The heat generated depends on the current and the line resistance, not directly on the voltage.
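
    A quick numeric check of that comparison, assuming (purely for illustration) the same 0.01-ohm line resistance in both cases:

    # Same apparent power delivered over the same line at two voltages.
    # The 0.01-ohm line resistance is an assumed figure, only for comparison.
    apparent_power = 100_000.0     # VA to be transmitted
    line_resistance = 0.01         # ohm, assumed identical in both cases

    for voltage in (100.0, 100_000.0):
        current = apparent_power / voltage       # 1000 A or 1 A
        loss = current**2 * line_resistance      # heat dissipated in the line, W
        print(f"{voltage:g} V -> {current:.0f} A -> {loss:g} W lost as heat")
    # 10,000 W lost at 100 V versus 0.01 W at 100,000 V: a factor of a million,
    # because the loss goes as the square of the current.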
     
  6. Feb 15, 2012 #5

    jim hardy

    Science Advisor
    Gold Member
    2016 Award

    Transmission guys design for the loss they want to tolerate;
    let's just say they can afford to lose 3% of the power along a line.
    Power is volts times amps.

    At high voltage the current is small, so you can use small conductors and still keep to a 3% voltage drop. In your example you could tolerate a 3 kV drop at 1 amp, which is 3000 ohms.

    At low voltage the current is high so you need large conductors to get 3% voltage drop.
    In your example that would be 3 volts at 1000 amps which is 0.003 ohms.

    Resistance of wire is proportional to 1/(its area), and area × length is the volume of copper required to make the wire...
    So for a given line length, 3000 / 0.003 = 1,000,000.
    A thousandfold increase in voltage yields a millionfold decrease in copper required?
    Somebody please check my arithmetic and logic... I haven't had my coffee yet.
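
    For anyone checking, here is that arithmetic as a small sketch, carrying over the 100,000 VA and 3% figures from above:

    # Allowed line resistance for a 3% voltage drop at each transmission voltage.
    apparent_power = 100_000.0    # VA, from the example above
    drop_fraction = 0.03          # tolerate 3% of the voltage dropped along the line

    allowed = {}
    for voltage in (100_000.0, 100.0):
        current = apparent_power / voltage              # 1 A or 1000 A
        allowed_drop = drop_fraction * voltage          # 3000 V or 3 V
        allowed[voltage] = allowed_drop / current       # 3000 ohm or 0.003 ohm
        print(f"{voltage:g} V: {current:g} A, allowed line resistance {allowed[voltage]:g} ohm")

    # For a fixed length, R goes as 1/area, so the copper volume goes as 1/R:
    print(round(allowed[100_000.0] / allowed[100.0]))   # about a million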

    Textbooks say it's mostly the cost of those huge conductors and the impracticality of towers to support them that makes high voltage economical. Longer insulators are cheap in comparison.
    #10 wire is quite close to one milliohm per foot. It's about the diameter of a soda straw.
    To get 0.003 ohms over a thousand feet would take something like three hundred strands of #10 in parallel, and roughly seventeen hundred strands to get 0.003 ohms over a mile...
    but a single strand of it is only 5.2 ohms per mile, more than adequate for your high voltage transmission.
    I think your example drives home what the textbooks say, but again sanity checks are welcome.
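
    Here is the same kind of check on the strand counts, using the approximate one-milliohm-per-foot figure for #10 copper:

    # Parallel strands of #10 copper needed to reach 0.003 ohm.
    ohms_per_foot = 0.001          # approximate resistance of #10 AWG copper
    target = 0.003                 # ohm, the low-voltage line allowance above

    for length_ft in (1000, 5280):                     # a thousand feet, one mile
        one_strand = ohms_per_foot * length_ft         # 1 ohm or ~5.3 ohm
        strands = one_strand / target                  # strands needed in parallel
        print(f"{length_ft} ft: one strand = {one_strand:.1f} ohm, need ~{strands:.0f} strands")
    # About 330 strands per thousand feet and about 1760 per mile, while a
    # single strand (~5.3 ohm per mile) is far below the 3000 ohm allowed
    # at the high transmission voltage.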


    Your mechanical friends are doubtless looking at the thermodynamic cycle.
    Exhaust steam from a turbine is condensed into water so you can pump it back into the boiler, and the heat of vaporization is lost. That amounts to roughly 1000 BTU per pound, which is quite a lot.
    If you can loan that exhaust steam to a nearby building where they condense it to warm themselves, then pump the water back to you, everybody wins, including the environment.
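
    To put a rough number on that (the steam flow rate below is an arbitrary assumption, and ~1000 BTU/lb is the round figure above):

    # Rough size of the heat otherwise dumped in the condenser.
    latent_heat = 1000.0            # BTU per pound of steam condensed (round figure)
    steam_flow = 10_000.0           # lb/hr, assumed exhaust-steam flow

    heat_btu_per_hr = latent_heat * steam_flow
    heat_kw = heat_btu_per_hr / 3412.0    # 1 kW is about 3412 BTU/hr
    print(f"{heat_btu_per_hr:.0f} BTU/hr, about {heat_kw:.0f} kW of recoverable heat")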



    old jim
     
    Last edited: Feb 15, 2012