I've been idly wondering about this question on and off for many years. In an AC transformer, the load on the source side comes from the work it has to do to drive currents in the other side via induction. I imagine that if the circuit on the other side were suddenly broken, there would be a power surge on the source side because of the dramatically reduced impedance, leading to a burnout or other calamities.

Our households are full of transformers in bricks somewhere on the cables that connect laptops and other electronic devices to the AC mains socket on the wall. Those transformers step down the mains voltage from 110 or 240 V to something like 12 V. When the laptop is charging, there is a load on the source side of the transformer. But what about when we unplug the laptop and leave the brick connected to the wall socket? Hardly any power is used (my house's power meter confirms this). Why is that?

My first guess is that it may have something to do with the fact that the brick not only steps down the voltage but also rectifies, turning AC into DC. Perhaps that is done in such a way that the source circuit is effectively broken when the device is removed (or when it is fully charged and draws no current). Or is there dedicated circuitry that effectively breaks the source circuit within the brick, or presents a very high impedance to it, when no power is needed to charge the device?

I am also curious about how much power such transformers use when they are not charging anything. I know it's not much, but a couple of dozen of them around the house can add up. If I understood what the circuits are doing at that time, I might get a better sense of the numbers.

I am grateful for anything anybody can type to educate me on this issue.
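To give a feel for how the idle draw can add up, here is a back-of-envelope sketch. The per-adapter standby wattages in it are assumptions for illustration, not measurements: older iron-core transformer bricks can idle around 1 W, while modern switch-mode chargers are often well under 0.1 W.

```python
# Rough estimate of standby ("vampire") power for idle chargers.
# The standby_watts figure is an assumed value, not a measurement.

def annual_standby_kwh(n_adapters: int, standby_watts: float,
                       hours_per_day: float = 24.0) -> float:
    """Energy in kWh per year drawn by idle adapters left plugged in."""
    return n_adapters * standby_watts * hours_per_day * 365 / 1000

# Two dozen adapters, assumed 0.5 W each, plugged in around the clock:
kwh = annual_standby_kwh(24, 0.5)
print(f"{kwh:.0f} kWh/year")  # 24 adapters * 0.5 W * 8760 h ≈ 105 kWh
```

At a typical residential rate this is on the order of tens of dollars a year for the pessimistic 0.5 W assumption, and an order of magnitude less if the bricks are modern switch-mode designs.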