Heat from an electric current?

  • Thread starter Frostfire
  • #1
Frostfire
I've posted this a while back but never got a reply.

How does one determine the heat generated by a current? I have found several sources that refer to using "the length of a wire," but how would you calculate it for an aqueous material, or, for all intents and purposes, a "really large" battery cell? Would P = I^2 * R still work?

Also, I haven't worked with high-current problems before; I remember something about resistance increasing drastically at high current density.
 

Answers and Replies

  • #2
vk6kro
Science Advisor
If you were forcing current through an aqueous solution, I squared R would work. However, you couldn't predict R and you wouldn't know if it stayed constant.
So, it would be better to just use voltage times current. You can measure these OK.
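
As a rough sketch of that approach (all numbers below are made-up example readings, not values from this thread):

```python
# Heating power in an aqueous (electrolytic) load from measured values.
# R of the solution is unknown and may drift, so use P = V * I directly.

measured_voltage = 12.0   # volts across the cell (assumed example value)
measured_current = 2.5    # amperes through the cell (assumed example value)

power_watts = measured_voltage * measured_current   # P = V * I
print(f"Dissipated power: {power_watts:.1f} W")

# Heat delivered over a time interval: Q = P * t (joules)
time_seconds = 60.0
heat_joules = power_watts * time_seconds
print(f"Heat over {time_seconds:.0f} s: {heat_joules:.0f} J")
```

With these sample readings that comes out to 30 W, or about 1800 J per minute.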

There are heating effects in a battery, but these are due to internal resistance in the leads to the electrodes and in the electrodes themselves, as well as limits in the chemical processes involved.

You can measure this internal resistance in a battery by loading it and noting the drop in terminal voltage. You could then predict the heating using I squared R.
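
A minimal sketch of that calculation (the open-circuit voltage, loaded voltage, and load current below are assumed example values):

```python
# Estimate a battery's internal resistance from the drop in terminal voltage
# under a known load, then predict internal heating with P = I^2 * R.

open_circuit_voltage = 12.6   # V, measured with no load (assumed value)
loaded_voltage = 12.0         # V, terminal voltage with the load connected (assumed)
load_current = 5.0            # A, current drawn by the load (assumed)

# The "lost" volts are dropped across the internal resistance.
internal_resistance = (open_circuit_voltage - loaded_voltage) / load_current
internal_heating = load_current ** 2 * internal_resistance   # W dissipated inside the battery

print(f"Internal resistance: {internal_resistance:.3f} ohm")
print(f"Internal heating at {load_current:.1f} A: {internal_heating:.2f} W")
```

For these example numbers the internal resistance works out to 0.12 ohm, giving about 3 W of heating inside the battery at 5 A.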

Resistance in a solid conductor does increase with temperature for most substances, although there are some, like carbon and semiconductors such as silicon and germanium, whose resistance decreases with temperature.
 
  • #3
Frostfire
Interesting, so that brings up another question. If calculating it in an aqueous solution involves working around resistance, then to minimize heat, would you use a material whose resistance decreases with current, and to maximize it, one that does the opposite? Does that sound right, even if it's incredibly oversimplified?
 
  • #4
vk6kro
Science Advisor
No, but you could compensate for the CHANGE in resistance of one substance by pairing it with a substance of the opposite temperature coefficient, so that you would get a more constant resistance.

So, if you had one resistor that increased resistance with temperature, you could put another in series with it that decreased resistance with temperature. This could partly cancel out the change, resulting in a total resistance that is more constant with temperature.
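
As a rough illustration of that cancellation (the resistance values, temperature coefficients, and the linear R(T) model below are all assumed for illustration):

```python
# Two series resistors with opposite temperature coefficients:
# the increase in one roughly cancels the decrease in the other.

def resistance_at(r_nominal, alpha, delta_t):
    """Simple linear model: R(T) = R0 * (1 + alpha * delta_T)."""
    return r_nominal * (1 + alpha * delta_t)

delta_t = 50.0                        # temperature rise in kelvin (assumed)
r1_nominal, alpha1 = 100.0, +0.004    # positive temperature coefficient (e.g. a metal)
r2_nominal, alpha2 = 100.0, -0.004    # negative temperature coefficient (e.g. carbon)

r1_hot = resistance_at(r1_nominal, alpha1, delta_t)
r2_hot = resistance_at(r2_nominal, alpha2, delta_t)

print(f"R1 alone drifts from {r1_nominal:.0f} to {r1_hot:.0f} ohm")
print(f"R2 alone drifts from {r2_nominal:.0f} to {r2_hot:.0f} ohm")
print(f"Series total stays near {r1_hot + r2_hot:.0f} ohm "
      f"(nominal {r1_nominal + r2_nominal:.0f} ohm)")
```

In this toy model, with equal nominal values and equal-and-opposite coefficients, the change cancels exactly; with real parts the cancellation is only partial, as noted above.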

Early (Edison) lamps used carbon filaments. These reduced resistance with temperature, meaning they would draw more current and get hotter. This is a sort of runaway process that could destroy the lamp.
Later lamps used metal filaments that increase resistance with temperature, so they tend to protect themselves by drawing less current when they get hot.
Germanium transistors had the same thermal-runaway problem and had to be used in circuits designed to stop this effect from destroying them.
 
  • #5
Frostfire
Thanks for the replies. I've been off for a while; you know how it is, out of class and work, trying to focus on something else for a bit :wink: I hadn't thought about the thermal balance; it's an interesting concept, though.

So, a theory question: if one were trying to maximize the heat generated by a system, say a super electric heater based on high voltage, barring engineering limitations, what would be the best way to set it up to maximize the heat produced from a given voltage (and at a given amperage, if required) over a given time?
 
