# Power Dissipated in a Resistor (really basic, but confused)

## Homework Statement

I understand the maths... I'm here to ask WHY we have to do it this way.

The question states:
"The power dissipated in a resistor is given by $P= E^2/R$. If $E=200$ and $R=8$, find the change in $P$ resulting in a drop of $5 Volts$ in $E$ and an increase of $0.2 Ohms$ in $R$."


## The Attempt at a Solution

Physically, I was thinking: plug in $200$ and $8$, then subtract from that answer the power calculated when $195$ and $8.2$ are put into the equation.

This gives a change in power of $\Delta P \approx 362.8\ \text{W}$.
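
Written out, that calculation is:

$$P(200, 8) = \frac{200^2}{8} = 5000\ \text{W}, \qquad P(195, 8.2) = \frac{195^2}{8.2} \approx 4637.2\ \text{W},$$

so the power drops by about $5000 - 4637.2 \approx 362.8\ \text{W}$.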

My line of thought was: if I have a resistor of 8 ohms with a voltage of 200 V across it, the power will be a certain value. Then if I had a similar resistor of 8.2 ohms with a voltage of 195 V across it, the difference when these values are put into the equation would be the change in power.

Why is this NOT the case? The "true" answer is apparently $375\ \text{W}$.

You get this by taking the partial derivatives of the equation with respect to $E$ and $R$. I've done the math and it checks out to that answer, but as stated: what is wrong with what I have done?
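
For reference, the differential calculation that produces the $375\ \text{W}$ figure is:

$$dP = \frac{\partial P}{\partial E}\,dE + \frac{\partial P}{\partial R}\,dR = \frac{2E}{R}\,dE - \frac{E^2}{R^2}\,dR = (50)(-5) - (625)(0.2) = -375\ \text{W},$$

i.e. a drop of $375\ \text{W}$.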

What is my fatal assumption?
Is it because the changes are small and thus calculus needs to be involved?

Thanks for any response.

Andrew Mason
Homework Helper
Your answer is correct. The precise change in power is $362.8\ \text{W}$. If you take the rate of change of power with respect to voltage times the change in voltage, plus the rate of change of power with respect to resistance times the change in resistance (i.e. $\frac{\partial P}{\partial E}\,\Delta E + \frac{\partial P}{\partial R}\,\Delta R$), you will only get an approximate answer, since $P$ is not a linear function of $E$ or $R$.

AM
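
A minimal Python sketch (not from the thread) comparing the two numbers under the problem's stated values:

```python
# Compare the exact change in P = E^2 / R with the linearized
# (differential) estimate for the values given in the problem.

def power(E, R):
    """Power dissipated in a resistor: P = E^2 / R."""
    return E**2 / R

E, R = 200.0, 8.0    # initial voltage (V) and resistance (ohms)
dE, dR = -5.0, 0.2   # drop of 5 V, increase of 0.2 ohms

# Exact change: evaluate P at both operating points and subtract.
exact = power(E + dE, R + dR) - power(E, R)

# Linear estimate: dP = (dP/dE)*dE + (dP/dR)*dR
#                     = (2E/R)*dE - (E^2/R^2)*dR
linear = (2 * E / R) * dE - (E**2 / R**2) * dR

print(f"exact change:      {exact:+.1f} W")   # -362.8 W
print(f"linearized change: {linear:+.1f} W")  # -375.0 W
```

The gap between the two outputs is exactly the point of the thread: the differential drops the higher-order terms, so it only approximates the true change.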