Hi, I have been researching pulsed electromagnets (inductors) and I have run into a problem. I have been using this formula to figure out how fast the current will rise in a coil given a certain voltage and the coil's inductance:

L (µH) / V = time to reach 1 A (in microseconds)

According to this formula, which from what I've found seems to be accurate, if you want the current to rise to 1 A in, say, 1 microsecond in an inductor with an inductance of 10 mH, then the required voltage would be 10,000 volts:

10,000 (µH) / 1 (µs) = 10,000 V

That seems like an awful lot of voltage, and it gets even more outrageous if you plug in larger inductors and smaller time frames:

1,000,000 (µH) / 0.001 (µs) = 1,000,000,000 V

Doesn't that sound too high? 1 billion volts just to pulse a 1 H coil to 1 A in 1 nanosecond? And if you were to repeat that pulse 500 million times a second, then I think you would be using 500 million watts of power, since the duty cycle would be 50%.

I'm just going to go ahead and assume something is wrong here like always, lol. So can someone please explain to me why this is wrong, if it is? And if it is not wrong, then why does it take so much voltage? High voltage here usually also means high power, since the current is 1 A and it's not just one pulse but many per second.
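In case it helps, here is a quick script I used to double-check the arithmetic. It assumes the ideal-inductor relation V = L * dI/dt (no winding resistance, constant applied voltage), which is just the formula above rearranged; the function name is my own:

```python
def voltage_for_current_rise(inductance_h, delta_i_a, delta_t_s):
    """Ideal inductor with no resistance: V = L * dI/dt.

    inductance_h : inductance in henries
    delta_i_a    : desired current rise in amps
    delta_t_s    : time allowed for that rise, in seconds
    """
    return inductance_h * delta_i_a / delta_t_s

# 10 mH coil, 1 A in 1 microsecond
print(voltage_for_current_rise(10e-3, 1.0, 1e-6))  # ~10,000 V

# 1 H coil, 1 A in 1 nanosecond
print(voltage_for_current_rise(1.0, 1.0, 1e-9))    # ~1e9 V
```

Both of my example numbers come out the same way with this, so at least the arithmetic is consistent with the formula.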