1. The problem statement, all variables and given/known data

Hi, so during an experiment I spun a magnet inside a coil of wire to induce an emf, and measured the input power and output power. One thing I found was that the slower the magnet spun (i.e. the smaller the input voltage to the motor), the larger the induced voltage and the greater the output power from the coil. Here is some of my data (motor voltage and current -> coil voltage and current):

9.98 V and 0.25 A -> 0.06 V and 0.04 µA
4.95 V and 0.15 A -> 0.08 V and 0.01 µA
6.01 V and 0.15 A -> 0.11 V and 0.15 µA

This was a general trend I found when changing variables such as the number of turns in the coil and the strength of the magnet; all my tests showed it. According to Faraday's law and the equation for induced emf, E = Blv (where B is the magnetic flux density, l the length of the wire, and v the velocity of the coil/magnet), the faster the magnet spins, the higher the expected emf and therefore the higher the voltage. Can anyone think of any reasons why this might be happening?

2. Relevant equations

E = Blv
P = IV

3. The attempt at a solution

I thought the skin effect, where resistance increases at higher AC frequencies, might have something to do with it, but it requires frequencies much, much higher than the ones I am using to have any noticeable effect.
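As a quick sanity check on the numbers, here is a minimal sketch (in Python) that applies P = IV to the data points exactly as posted, converting the microamp readings on the coil side to amps. The variable names are my own, not from the original post:

```python
# Each tuple is (motor V, motor A, coil V, coil µA converted to A),
# copied from the measurements given in the post.
data = [
    (9.98, 0.25, 0.06, 0.04e-6),
    (4.95, 0.15, 0.08, 0.01e-6),
    (6.01, 0.15, 0.11, 0.15e-6),
]

for v_in, i_in, v_out, i_out in data:
    p_in = v_in * i_in    # electrical power into the motor (W)
    p_out = v_out * i_out  # power delivered by the coil (W)
    print(f"P_in = {p_in:.3f} W, P_out = {p_out:.2e} W, "
          f"ratio P_out/P_in = {p_out / p_in:.2e}")
```

This makes it easy to compare input and output power across runs without redoing the arithmetic by hand; note the output powers are all in the nanowatt range.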