Hello, I've been thinking about why (elementary) transformer efficiency drops drastically at very low frequencies. I know hysteresis effects play a major role in reducing efficiency at high frequencies, but why at low ones? I realise that as we reduce the frequency of the EMF, we're making the circuit "more and more DC", but a frequency still exists, no? The transformer I'm talking about is a simple home-made single-phase one: two solenoids magnetically linked by a ferromagnetic core. Any help would be greatly appreciated.
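
To make the "more and more DC" intuition concrete, here is the standard Faraday's-law relation I'm reasoning from (V_max is the primary voltage amplitude, N the number of primary turns, f the frequency, and Φ_max the peak core flux; the notation is just my own):

$$ V(t) = N\,\frac{d\Phi}{dt}, \qquad \Phi(t) = \Phi_{\max}\sin(2\pi f t) \;\Rightarrow\; \Phi_{\max} = \frac{V_{\max}}{2\pi f N} $$

So, for a fixed applied voltage, lowering f pushes the required peak flux up, but I don't see how that (or anything else) translates into the sharp efficiency drop.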