I understand the maths... I'm here to ask WHY we have to do it this way.
The question states:
"The power dissipated in a resistor is given by [itex]P= E^2/R[/itex]. If [itex] E=200[/itex] and [itex] R=8 [/itex], find the change in [itex] P [/itex] resulting in a drop of [itex] 5 Volts [/itex] in [itex] E [/itex] and an increase of [itex] 0.2 Ohms [/itex] in [itex] R [/itex]."
The Attempt at a Solution
Physically I was thinking, okay plug in [itex] 200 [/itex] and [itex] 8 [/itex] then subtract from that answer the power calculated when [itex] 195 [/itex] and [itex] 8.2 [/itex] are input into the equation.
This gives a change in power of [itex]\approx 362.8\,W[/itex].
My line of thought was: if I have a resistor of [itex]8\,\Omega[/itex] with [itex]200\,V[/itex] across it, the power will be a certain value. If I then had a similar resistor of resistance [itex]8.2\,\Omega[/itex] with [itex]195\,V[/itex] across it, the difference between these two powers should be the change in power.
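To make the direct computation concrete, here is a quick numerical sketch of the subtraction described above (my own illustration; the function name `power` is mine, not from the problem):

```python
def power(E, R):
    """Power dissipated in a resistor: P = E^2 / R."""
    return E**2 / R

P_old = power(200, 8)      # 5000.0 W
P_new = power(195, 8.2)    # roughly 4637.2 W
exact_change = P_old - P_new
print(round(exact_change, 1))  # prints 362.8
```

So the exact drop in power is about [itex]362.8\,W[/itex], as stated.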
Why is this NOT the case? Apparently the true answer is [itex]375\,W[/itex].
You get this by taking the partial derivatives of the equation with respect to [itex]E[/itex] and [itex]R[/itex]. I've done the maths and it checks out to that answer alright, but as stated: what is wrong with what I have done?
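For reference, the total-differential estimate works out as follows. The differential of [itex]P = E^2/R[/itex] is

[itex]dP = \frac{\partial P}{\partial E}\,dE + \frac{\partial P}{\partial R}\,dR = \frac{2E}{R}\,dE - \frac{E^2}{R^2}\,dR[/itex]

and substituting [itex]E = 200[/itex], [itex]R = 8[/itex], [itex]dE = -5[/itex], [itex]dR = 0.2[/itex] gives

[itex]dP = \frac{2(200)}{8}(-5) - \frac{200^2}{8^2}(0.2) = -250 - 125 = -375\,W[/itex]

i.e. a drop of [itex]375\,W[/itex].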
What is my fatal assumption?
Is it because the changes are small and thus calculus needs to be involved?
Thanks for any response.