1. The problem statement, all variables and given/known data

An inductor with L = 50 mH is in series with a resistor of R = 180 Ω. At t = 0, a potential difference of 45 V is suddenly applied across the series combination. At what rate is the current increasing 1.2 ms later?

2. Relevant equations

V = IR + L(dI/dt) (Kirchhoff's loop rule, with the inductor's back-EMF ε = −L(dI/dt))

I = dQ/dt

3. The attempt at a solution

My first instinct was to find the maximum current, i.e. the steady state where dI/dt = 0, but I'm not sure that gets me anywhere on its own. I've been trying to combine the relevant equations into a differential equation for I(t), but I've gotten stuck there, too. Any pointers in the right direction would be greatly appreciated. Thank you!
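In case it helps show where my reasoning runs out, here is my tentative setup and solution of the differential equation. I'm assuming the current starts at I(0) = 0, since the inductor should oppose any sudden jump in current:

$$V = IR + L\frac{dI}{dt} \quad\Longrightarrow\quad I(t) = \frac{V}{R}\left(1 - e^{-Rt/L}\right), \qquad \frac{dI}{dt} = \frac{V}{L}\,e^{-Rt/L}$$

If that setup is right, then with \(\tau = L/R \approx 0.278\) ms, at \(t = 1.2\) ms I would get \(\frac{dI}{dt} = \frac{45}{0.050}\,e^{-1.2/0.278} \approx 900 \times 0.0133 \approx 12\) A/s, but I'd appreciate a sanity check on both the setup and the sign conventions before trusting that number.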