1. The problem statement, all variables and given/known data

An inductor (X_L = 40.2 ohm) is connected in series with a resistance (30 ohm) and an AC source (10 V, 80 Hz), and the current flowing in this circuit is 0.2 A. How could you reduce the phase difference between current and voltage to zero without changing the value of the current or the AC source?

2. Relevant equations

3. The attempt at a solution

My textbook says that we can do this in two different ways.

First, we add a capacitor with a capacitance of 50 microfarad in series with the inductor. In this case the capacitive reactance equals the inductive reactance, so they cancel each other and the phase difference is zero. (And that's what I thought of when I solved the problem.)

Second (and this is what didn't come to my mind and what I don't understand): we add another resistance to the circuit. This is how the book calculated its value:

Z (in the circuit before any change) = √(R² + X_L²) = 50.2 ohm

The added resistance is Rx:

Rx = Z − R = 50.2 − 30 = 20.2 ohm

So the equivalent ohmic resistance in the circuit equals the impedance.

I don't understand the second way, because the inductor is still in the circuit. That means its effect is still there, and it will still try to make the current lag the voltage, right?
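To check the book's arithmetic, here is a quick sketch (my own illustration, not from the book) that reproduces the numbers above: the impedance Z, the current, the capacitance that makes X_C = X_L, and the book's Rx = Z − R.

```python
import math

# Given values from the problem statement
V, f, R, XL = 10.0, 80.0, 30.0, 40.2  # volts, hertz, ohms, ohms

# Impedance of the original series R-L circuit
Z = math.sqrt(R**2 + XL**2)    # ≈ 50.2 ohm
I = V / Z                      # ≈ 0.2 A, matching the stated current

# First method: a series capacitor whose reactance cancels X_L
# X_C = 1 / (2*pi*f*C) = XL  =>  C = 1 / (2*pi*f*XL)
C = 1 / (2 * math.pi * f * XL)  # ≈ 50e-6 F, i.e. about 50 microfarad

# Second method (the book's): extra series resistance Rx = Z - R
Rx = Z - R                     # ≈ 20.2 ohm

print(f"Z = {Z:.1f} ohm, I = {I:.2f} A, C = {C*1e6:.1f} uF, Rx = {Rx:.1f} ohm")
```

This only confirms the numbers are self-consistent; it does not address whether the second method actually makes the phase difference zero, which is exactly the question.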