
Heat from an electric current?

by Frostfire
Tags: current, electric, heat
Frostfire
#1
Jun14-10, 12:32 PM
P: 50
I've posted this a while back but never got a reply.

How does one determine the heat generated from a current? I have found several sources that refer to using "the length of a wire", but how would you calculate it for an aqueous material, or, for all intents and purposes, a "really large" battery cell? Would P = I^2 * R still work?

Also, I haven't worked with high-current problems before; I remember something about resistance increasing drastically with high current density.
vk6kro
#2
Jun14-10, 06:41 PM
Sci Advisor
P: 4,016
If you were forcing current through an aqueous solution, I squared R would work. However, you couldn't predict R and you wouldn't know if it stayed constant.
So, it would be better to just use voltage times current. You can measure these OK.

There are heating effects in a battery but this is due to internal resistances in the leads to the electrodes, and the electrodes themselves, as well as limits in the chemical processes involved.

You can measure this internal resistance in a battery by loading it and noting the drop in terminal voltage. You could then predict the heating using I squared R.
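As a rough worked sketch of that procedure (the numbers below are made up purely for illustration, not from any real battery):

# Estimate internal resistance from the drop in terminal voltage under load,
# then predict the heat dissipated inside the battery as I^2 * R.
# All values are illustrative assumptions.

v_open = 12.6      # terminal voltage with no load (V)
v_loaded = 11.8    # terminal voltage under load (V)
i_load = 20.0      # load current (A)

r_internal = (v_open - v_loaded) / i_load   # ohms, assumed constant
p_heat = i_load**2 * r_internal             # watts dissipated inside the battery

print(f"Internal resistance ~ {r_internal:.3f} ohm")   # ~0.040 ohm
print(f"Internal heating ~ {p_heat:.1f} W")            # ~16 W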

Resistance in a solid conductor does increase with temperature for most substances, although some, like carbon and semiconductors such as silicon and germanium, decrease in resistance with temperature.
Frostfire
#3
Jun14-10, 07:56 PM
P: 50
Interesting, so that brings up another question. If calculating it for an aqueous solution involves working around the resistance: if one were trying to minimize heat, you would use a material whose resistance decreases as it carries current, and to maximize heat, one that does the opposite? Does that sound right, even if incredibly oversimplified?

vk6kro
#4
Jun14-10, 08:48 PM
Sci Advisor
P: 4,016

No, but you could compensate for the CHANGE in resistance of one substance by pairing it with a substance of opposite temperature coefficient, so that the combination has a more constant resistance.

So, if you had one resistor that increased resistance with temperature, you could put another in series with it that decreased resistance with temperature. This could partly cancel out the change resulting in a more constant total resistance with temperature.
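A minimal sketch of that series-compensation idea, assuming a simple linear model R(T) = R0 * (1 + alpha * (T - T0)) and made-up component values:

def resistance(r0, alpha, t, t0=20.0):
    """Resistance at temperature t for a linear temperature coefficient alpha."""
    return r0 * (1 + alpha * (t - t0))

# One resistor rises with temperature, the other falls by the same fraction.
r_positive = lambda t: resistance(100.0, +0.004, t)
r_negative = lambda t: resistance(100.0, -0.004, t)

for t in (20, 60, 100):
    total = r_positive(t) + r_negative(t)
    print(f"{t:3d} C: {r_positive(t):6.1f} + {r_negative(t):6.1f} = {total:6.1f} ohm")

With equal and opposite coefficients the total stays at 200 ohm at every temperature; real parts would only partly cancel, as the post above says.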

Early (Edison) lamps used carbon filaments. These reduced resistance with temperature, meaning they would draw more current and get hotter. This is a sort of runaway process that could destroy the lamp.
Later lamps used metal filaments that increase resistance with temperature, so they tend to protect themselves by drawing less current when they get hot.
Germanium transistors had the same thermal runaway problem and had to be used in circuits designed to stop this effect from destroying them.
Frostfire
#5
Jul10-10, 07:08 PM
P: 50
Thanks for the replies, I've been off for a while; you know how it is, out of class and work, trying to focus on something else for a bit. I hadn't thought about the thermal balance, interesting concept though.

So, a theory question: if one were trying to maximize the heat generated by a system, say a super electric heater running on high voltage, barring engineering limitations, what would be the best way to set it up to maximize the heat produced from a given voltage (and a given amperage as well, if required) over a given time?


