
Power dissipation limit for a resistor

  1. Oct 21, 2014 #1
I am confused about the concept of a power dissipation limit for a resistor. Basically, for a resistor, the product of the current through it and the potential across it should not exceed 0.25 W, or else it starts to heat up and behave in a non-linear fashion. Is this value just a constant, or is there a way of obtaining it for resistors of different ohm values?
  3. Oct 21, 2014 #2


Science Advisor, Gold Member

I am not sure what you are asking. Since the resistor "locks" the voltage to the current (U = RI), it is usually easier to think of the maximum dissipation as imposing a limit on how much current you can push through the resistor. This amount will of course depend on the value of the resistor, since the dissipation can also be written W = R*I^2 (or, if you prefer to think in terms of voltage, W = U^2/R).
    Note also that 0.25 W is just the value for one specific series of resistors; it is simply related to how the part is designed, and there is nothing "fundamental" about it. You can get resistors that tolerate more (or less) power.
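The two forms of the dissipation formula above can be turned around to give the maximum current and voltage for a given power rating. A minimal Python sketch (the 0.25 W / 1 kΩ values are just illustrative examples, not from any specific part):

```python
import math

def max_current(rating_w, resistance_ohm):
    """Largest steady current before W = R*I^2 exceeds the rating: I = sqrt(W/R)."""
    return math.sqrt(rating_w / resistance_ohm)

def max_voltage(rating_w, resistance_ohm):
    """Largest steady voltage before W = U^2/R exceeds the rating: U = sqrt(W*R)."""
    return math.sqrt(rating_w * resistance_ohm)

# Example: a 0.25 W, 1 kohm resistor
print(max_current(0.25, 1000))   # ~0.0158 A (about 16 mA)
print(max_voltage(0.25, 1000))   # ~15.8 V
```

Note how both limits depend on the resistance value, which is why there is no single "safe voltage" for all 0.25 W resistors.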
  4. Oct 21, 2014 #3
The resistor is a heat source, so it is really all about heat: above a given temperature the resistor will have a diminished life, and beyond that it will fail in a very short time. Also note that the cooling of the resistor is a key factor; the datasheet will indicate the ambient, still-air temperature at which the (0.25 W) rating applies. Air temperature, airflow, and other local heat sources all affect the true limit. Some power resistors are designed to be mounted to heatsinks; the better the heatsink, the more wattage the resistor can dissipate. In these cases the datasheet gives a very specific mounting case for the wattage rating.
  5. Oct 21, 2014 #4
    Hint1: make it big and with big area
    Hint2: anybody for a manganin?
  6. Oct 21, 2014 #5
Basically, what I'm asking is how you would determine a range of safe voltages for a resistor, and what voltage would cause enough heating to make the resistor act in a non-linear fashion. I was told the product of current and voltage can't exceed 0.25 W, although I don't see how they got this value. Is there a way of calculating it, and is it determined by the resistance value, how the resistors are connected, etc.?
  7. Oct 21, 2014 #6


Gold Member

The wattage (power rating) is how much heat the resistor body can withstand. The power dissipated in the body is the resistance times the square of the current. You can get resistors rated for 1/4 watt, and you can get resistors rated for 1000 watts, etc.

    If you put a voltage across a resistor that causes (via Ohm's Law) enough current to flow for the power equation to exceed the wattage rating, the resistor will begin to fail. Do it strongly enough, quickly enough, and the resistor will POP in two because it can't dissipate the heat fast enough and some of the body turns to smoke/gasses and it explodes.
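The chain described above (applied voltage → current via Ohm's law → dissipated power → compare against the rating) can be sketched in a few lines of Python; the 12 V / 100 Ω / 0.25 W numbers are hypothetical examples chosen to show a failing case:

```python
def dissipated_power(voltage_v, resistance_ohm):
    """Power dissipated in the resistor via Ohm's law: W = U^2 / R."""
    return voltage_v ** 2 / resistance_ohm

def within_rating(voltage_v, resistance_ohm, rating_w):
    """True if the applied voltage keeps dissipation at or below the rating."""
    return dissipated_power(voltage_v, resistance_ohm) <= rating_w

# 12 V across a 100 ohm, 0.25 W resistor dissipates 1.44 W -> well over the rating
print(dissipated_power(12, 100))      # 1.44
print(within_rating(12, 100, 0.25))   # False
```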
  8. Oct 21, 2014 #7
Since the power (W) is proportional to the square of the voltage, the "safe voltage" tends to be a relatively narrow band unless you change the cooling. Perhaps it will help to explain why this issue is of such interest. Reading your OP: it is not that the resistor starts to heat up at 0.25 W; at 0.25 W it stays within spec. It starts heating up as soon as there is any current.
  9. Oct 21, 2014 #8


Gold Member

    Although generally right, that's a bit strong as a blanket statement. How about a 10 ohm 1000 watt resistor? The "within spec" voltage range is 0 volts to 100 volts. That doesn't seem all that limited.
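The 0–100 V figure quoted above follows directly from U = sqrt(W*R). A quick Python check, also comparing against a hypothetical 1/4 W part of the same resistance:

```python
import math

def safe_voltage_limit(rating_w, resistance_ohm):
    """Upper end of the safe voltage range: U = sqrt(W * R). The range starts at 0 V."""
    return math.sqrt(rating_w * resistance_ohm)

print(safe_voltage_limit(1000, 10))   # 100.0 -> the 0 to 100 V range quoted above
print(safe_voltage_limit(0.25, 10))   # ~1.58 V for a 1/4 W resistor of the same 10 ohm value
```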

    Also, just a quibble, but I think "band" is a poor description and "range" would be better since it always starts at zero.