
## Main Question or Discussion Point

I'm having a bit of trouble wrapping my head around why inductors behave the way they do in certain circuits. Every physical explanation for why ##V = -L\frac{dI}{dt}## that I've seen explains how the voltage across the inductor is developed as the current changes: the changing current means a changing magnetic field, which induces an electric field along the wire. This electric field provides an EMF which, if the inductor has zero resistance, equals the potential difference that develops across the ends of the inductor.
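For reference, the chain of reasoning above can be condensed into one line. Assuming a coil of fixed inductance with flux linkage ##\Phi = LI##, Faraday's law gives

$$\mathcal{E} = -\frac{d\Phi}{dt} = -\frac{d(LI)}{dt} = -L\frac{dI}{dt} = V,$$

which is the relation quoted above, with the EMF equal to the terminal voltage when the winding has zero resistance.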

However, in many (most, even) circuits of interest it is a (possibly time-dependent) voltage that is *applied* across an inductor, and the current thus builds up according to the integral of the above equation. While I can see well enough how this works mathematically (the equation is valid whether ##I## or ##V## is the independent variable), I have a hard time turning the above argument around to understand physically what's happening. In terms of Maxwell's equations (and perhaps the Lorentz force law, if necessary), how can you derive the relation ##\frac{dI}{dt} = -V/L## when ##V## is applied across the inductor? To take the simplest example, consider an ideal inductor in series with a battery: the current through the inductor simply increases linearly as ##I(t) = \frac{V}{L}t##, with the induced EMF opposing the applied voltage. How should I understand this behaviour from first principles?
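The battery-plus-ideal-inductor case can be checked numerically. This is just an illustrative sketch (the values of ##V##, ##L##, and the step size are arbitrary choices, not from the discussion): integrating ##\frac{dI}{dt} = V/L## with a forward-Euler step reproduces the linear ramp ##I(t) = \frac{V}{L}t##.

```python
# Sketch: current buildup in an ideal inductor driven by a constant voltage.
# Integrates dI/dt = V/L by forward Euler and compares to I(t) = (V/L) t.
# V, L, dt, and steps are illustrative values, not taken from the post.

V = 5.0       # applied battery voltage (volts)
L = 2.0       # inductance (henries)
dt = 1e-4     # time step (seconds)
steps = 10_000

I = 0.0
for _ in range(steps):
    I += (V / L) * dt        # Euler step of dI/dt = V/L

t = steps * dt
analytic = (V / L) * t       # closed-form linear ramp I(t) = (V/L) t
print(I, analytic)
```

For a constant applied voltage the Euler sum is exact up to floating-point rounding, since the right-hand side does not depend on ##I##; the interesting cases are time-dependent sources, where the same loop integrates ##I(t) = \frac{1}{L}\int_0^t V(t')\,dt'##.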