
How much resistance for a heating resistive element?

  1. Sep 3, 2015 #1
    Hello Forum,

    Most devices that are supposed to generate heat via resistance are connected to a constant voltage source. That means that the amount of power dissipated as heat is given by

    P= V^2/R = I^2*R

    This implies that the smaller the resistance the larger the current through the heating resistive element and the more heat is generated.
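A quick numerical sketch of this point, assuming an ideal 120 V constant-voltage supply (the voltage and resistance values are illustrative, not from the thread):

```python
# Power dissipated by a resistive element on an ideal constant-voltage
# supply: P = V^2 / R, equivalently I^2 * R with I = V / R.
def heater_power(v_supply, r_element):
    """Return (current in A, power in W) for an element of resistance
    r_element connected across v_supply."""
    current = v_supply / r_element
    power = v_supply ** 2 / r_element
    return current, power

# Illustrative resistances: halving R doubles both current and power.
for r in (10.0, 20.0, 60.0):
    i, p = heater_power(120.0, r)
    print(f"R = {r:5.1f} ohm -> I = {i:5.2f} A, P = {p:7.1f} W")
```

At 120 V this prints 1440 W for 10 ohm, 720 W for 20 ohm, and 240 W for 60 ohm, matching the claim that a smaller R draws more current and generates more heat.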

But conducting wires also have very small resistance, yet they don't dissipate much power. The reason is that the wires and the heating element are in series and carry the same current, which is set almost entirely by the much larger resistance R of the element; the wires therefore dissipate only I^2*R_wire, which is tiny compared to I^2*R in the element...

So, given a certain potential difference V, what is the ideal amount of resistance for a resistive heating element? With an ideal constant-voltage source, P = V^2/R says a smaller R always gives more power, but in practice a very small R would draw more current than the supply and wiring can safely deliver. If R is too large, the current I is too small and the dissipated power I^2*R is small again. There seems to be a suitable value of R that yields the right and desired amount of heat generation...

In general, I would think R needs to be "small", whatever that may mean, to draw a sufficiently large current and dissipate enough heat (P = I^2*R). So resistive heating elements (toasters, stoves, etc.) are relatively low-resistance devices...
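Working the formula backwards makes the question concrete: given the supply voltage and the heat output you want, the element resistance follows from R = V^2/P. The 120 V / 1000 W toaster numbers below are illustrative assumptions, not values from the thread:

```python
# Given a supply voltage and a target heating power, the required
# element resistance is R = V^2 / P (from P = V^2 / R).
def element_resistance(v_supply, p_target):
    """Return the resistance (ohms) that dissipates p_target watts
    when connected across v_supply volts."""
    return v_supply ** 2 / p_target

# Hypothetical 1000 W toaster on a 120 V supply:
r = element_resistance(120.0, 1000.0)
i = 120.0 / r
print(f"R = {r:.1f} ohm, drawing I = {i:.2f} A")  # R = 14.4 ohm, I = 8.33 A
```

So the "right" R is not an extremum at all: it is simply whatever value makes V^2/R equal the power the appliance is designed to produce.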

  3. Sep 3, 2015 #2


    Science Advisor
    Gold Member

    There is a very wise man on this forum who says: A question well stated is half answered.
    I would say you have answered your own question. The heating element (nichrome, etc.) is sized to produce the desired heat at the specified voltage.
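To make "sized" concrete: for a wire element, resistance follows from the material and geometry via R = rho*L/A, so the designer picks a wire gauge and length that hit the target R. The numbers below are illustrative assumptions (nichrome resistivity of roughly 1.1e-6 ohm*m, a 0.5 mm wire diameter, and a 14.4 ohm target), not values from the thread:

```python
import math

# A wire element of resistivity rho, length L, and cross-sectional
# area A has resistance R = rho * L / A; solve for L given R and A.
RHO_NICHROME = 1.1e-6  # ohm * m, approximate room-temperature value

def wire_length(r_target, diameter_m, rho=RHO_NICHROME):
    """Return the wire length (m) giving resistance r_target for a
    round wire of the given diameter."""
    area = math.pi * (diameter_m / 2) ** 2  # cross-sectional area, m^2
    return r_target * area / rho

# Hypothetical element: 14.4 ohm from 0.5 mm diameter nichrome wire.
length = wire_length(14.4, 0.5e-3)
print(f"~{length:.2f} m of 0.5 mm nichrome wire")  # ~2.57 m
```

A few meters of thin nichrome, coiled up, is indeed what you find inside a toaster, which is why the answer above says the element is "sized" for a specified voltage.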
  4. Sep 3, 2015 #3
    Ok :)

    Good enough, thank you.
