Here's my problem: Rs is a resistor whose value depends on temperature: Rs = R0 - 1.8T, where R0 = 5000 ohms and T is in kelvin. The ideal op amp has +/- 15 V power rails. To maximize the circuit's sensitivity, what should Rf be, and what is the maximum sensitivity at that resistance? A drawing of the circuit is attached as a file. In the drawing, R2 corresponds to Rs (the temperature-dependent resistor) and R7 = R8 correspond to Rf. Please note that the drawing shows the circuit solved for T = 72 degrees Fahrenheit and Rf = 300 kOhm.

What I have done: I solved the circuit (using Mathematica) to obtain an expression for Vout in terms of Rf and Rs. The circuit's sensitivity is dVout/dT (the first derivative of output voltage with respect to temperature), so I also have an expression for that. Since the op amp's power rails dictate the maximum Vout, I should have Vout = 15. To find the maximum sensitivity, I set the second derivative of Vout with respect to T equal to zero, correct? That gives me two equations in two unknowns, which I can solve for Rf and Rs.

Is this approach correct? Or am I missing something easier?
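In case a symbolic sanity check of the calculus is useful, here is a minimal SymPy sketch of the derivative steps. The transfer function used below (an inverting-amplifier relation Vout = -Vin*Rf/Rs, with a hypothetical input Vin) is purely an illustrative assumption, not the actual circuit from the drawing; substitute your own Vout(Rf, Rs) expression from the Mathematica solution.

```python
# Sketch of the sensitivity calculation, assuming an inverting-amp
# topology Vout = -Vin * Rf / Rs. This topology and the symbol Vin are
# illustrative assumptions; replace Vout with your own solved expression.
import sympy as sp

T, Rf, Vin, R0 = sp.symbols('T R_f V_in R_0', positive=True)

Rs = R0 - sp.Rational(9, 5) * T      # Rs = R0 - 1.8*T  (T in kelvin)
Vout = -Vin * Rf / Rs                # assumed transfer function

sensitivity = sp.simplify(sp.diff(Vout, T))     # dVout/dT
curvature = sp.simplify(sp.diff(Vout, T, 2))    # d^2 Vout / dT^2

print('dVout/dT   =', sensitivity)
print('d2Vout/dT2 =', curvature)
```

With your real Vout(Rf, Rs) in place of the assumed one, the same `sp.diff` calls give the two expressions you would then combine with the rail constraint Vout = 15 and solve (e.g. with `sp.solve`) for Rf and Rs.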