Current through a resistor that heats up: what happens to the current?

SUMMARY

When a constant voltage is applied to a resistor carrying 1 A, the current will decrease as the resistor heats up, provided the resistor is made of wire (or another material whose resistance rises with temperature). By Ohm's law, I = V/R, so with V held fixed an increase in R means a smaller I. For an ideal resistor whose resistance is independent of temperature, the current remains unchanged. This distinction is important for understanding the behavior of real versus ideal resistors in electrical circuits.
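A compact way to see the reasoning is sketched below. The linear temperature model is a standard approximation, not something stated in the thread; R_0, T_0, and alpha are assumed symbols (reference resistance, reference temperature, and temperature coefficient of the wire).

```latex
% Assumed linear model: R_0 = resistance at reference temperature T_0,
% \alpha = temperature coefficient of the wire (> 0 for metals such as copper)
R(T) = R_0\bigl[1 + \alpha\,(T - T_0)\bigr],
\qquad
I(T) = \frac{V}{R(T)} = \frac{V}{R_0\bigl[1 + \alpha\,(T - T_0)\bigr]}
```

With V fixed, a positive alpha makes I(T) fall as the resistor warms; an ideal resistor corresponds to alpha = 0, so I stays at V/R_0.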

PREREQUISITES
  • Understanding of Ohm's Law
  • Knowledge of resistance-temperature relationship in conductors
  • Familiarity with electrical components and their properties
  • Basic principles of heat dissipation in resistors
NEXT STEPS
  • Study the effects of temperature on resistance in materials
  • Learn about ideal versus non-ideal resistors
  • Explore the concept of thermal runaway in electronic components
  • Investigate the role of heat sinks in managing resistor temperatures
USEFUL FOR

Electrical engineers, physics students, and anyone interested in the thermal behavior of resistors in circuits will benefit from this discussion.

rcmango

Homework Statement

Suppose a current of 1 A is flowing through a resistor. If this makes the resistor heat up, will the current through the resistor increase, decrease, or remain constant? Assume the voltage applied to the resistor is constant.

Homework Equations

The Attempt at a Solution

Please help me with this. I'm not sure, but I believe the current will decrease slightly because of the heat dissipated?
 
rcmango said:
Please help me with this. I'm not sure, but I believe the current will decrease slightly because of the heat dissipated?

The resistance of many electrical components changes with temperature. Certainly the resistance of a wire increases when it gets hotter, so if the resistor is actually made up of a length of wire you can expect that change in resistance - with an effect on the current I am sure you can predict.
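As a rough numerical illustration of that effect (the values below are assumptions for the sketch, not from the thread: a 12 V supply, a 12 Ω wire resistor so the initial current is 1 A, and a copper-like temperature coefficient of about 0.004 per °C):

```python
# Sketch: how the current through a wire resistor changes as it heats up,
# assuming a fixed applied voltage and a linear resistance-temperature model.
# All component values are illustrative assumptions.

V = 12.0        # applied voltage (volts), held constant
R0 = 12.0       # resistance at the reference temperature (ohms) -> 1 A initially
ALPHA = 0.004   # temperature coefficient of resistance (per deg C), typical for copper

def resistance(delta_t_celsius: float) -> float:
    """Resistance after the wire warms by delta_t_celsius above its reference temperature."""
    return R0 * (1.0 + ALPHA * delta_t_celsius)

def current(delta_t_celsius: float) -> float:
    """Current from Ohm's law I = V / R(T), with the voltage fixed."""
    return V / resistance(delta_t_celsius)

for dt in (0, 20, 50, 100):
    print(f"+{dt:3d} degC rise: R = {resistance(dt):5.2f} ohm, I = {current(dt):5.3f} A")

# The printed current drops from 1.000 A toward about 0.714 A at a +100 degC rise,
# because R increases while V stays the same.
```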
 
Okay, so as the wire heats up the resistance of the wire increases, which causes the current to decrease. Is that correct?
 
rcmango said:
Okay, so as the wire heats up the resistance of the wire increases, which causes the current to decrease. Is that correct?

That sounds good.

EDIT: provided the resistor is made using wire.
 
The current decrease is NOT caused by the heat dissipation itself, but by the rise in resistance that the heating produces. For an ideal resistor whose resistance does not change even when it heats up, the current doesn't change.
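Restating that distinction in symbols (just a sketch of the point above; nothing is assumed beyond Ohm's law and the power relations):

```latex
% Heat is dissipated in both cases: P = VI = I^2 R.
% Ideal resistor: R is constant, so I = V/R does not change as it warms.
% Real wire resistor: R rises with temperature, so I = V/R falls;
% the dissipation itself is not what reduces the current.
I = \frac{V}{R}, \qquad P = VI = I^{2}R
```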
 
