Hello all. I have a question about AC motors that has been driving me crazy.

I understand how variable frequency drives work: by adjusting the power frequency, they change the rpm of the motor according to

rpm = 120 x f / p

where f is the frequency (Hz) and p is the number of motor poles. What I don't understand is how this increase in motor speed relates to voltage and current; from the equation, it looks like they don't matter at all.

For example, take a 3-phase, 4-pole AC motor connected to a 15 A, 110 VAC, 60 Hz power source (US utility power). With no load to cause slip, this motor will spin at 1800 rpm. Now put a variable frequency drive between the motor and the outlet and crank the frequency up to 120 Hz; the motor will reach 3600 rpm.

But what effect will this boost in rpm have on the power drawn from the line? From what I have read, the only thing that changes is the frequency, so the motor should draw no more power than when it was spinning at 1800 rpm. That doesn't make sense to me, since there is more energy available in the faster-spinning motor.

Can anyone help me understand this?
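To show exactly how I'm applying the formula, here's a quick sketch of my calculation (the function name is just mine; it assumes the ideal no-slip case, so a real loaded motor would run slightly slower):

```python
def sync_rpm(freq_hz, poles):
    """Synchronous speed of an AC motor: rpm = 120 * f / p (no slip)."""
    return 120.0 * freq_hz / poles

# 3-phase, 4-pole motor on 60 Hz utility power
print(sync_rpm(60, 4))   # -> 1800.0 rpm

# Same motor with the VFD output cranked to 120 Hz
print(sync_rpm(120, 4))  # -> 3600.0 rpm
```

So the speed numbers check out; it's the voltage/current side that I can't connect to this.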