So I've been working with variable frequency drives recently and am curious about what they're actually controlling: motors. Specifically, how are current and torque related? Torque is obviously a force that causes something to rotate around an axis, so when the drive commands the motor to speed up or slow down, should the torque temporarily go up during the acceleration and then drop off once the motor settles at the new speed?

I'm also curious how voltage and frequency affect the speed of the motor. Say it's a 3-phase AC motor; the applied voltage has both a magnitude and a frequency. Correct me if I'm wrong here: the frequency, in essence, controls the speed of the motor. However, if you want the motor to run faster, don't you also have to ramp up the voltage magnitude so the motor can keep going and stay in sync with the frequency? Is that an incorrect assumption?

And lastly, if the motor runs into some resistance, this will decrease the power output from the motor, but why does the drawn current also decrease?
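To make the frequency/voltage part of my question concrete, here's a quick sketch of my current understanding: synchronous speed follows the standard n = 120·f/p formula, and the drive scales voltage in proportion to frequency (constant V/f) below base speed. The 4-pole, 460 V / 60 Hz motor here is just an example I made up; tell me if I'm applying these relationships wrong.

```python
def synchronous_speed_rpm(freq_hz: float, poles: int) -> float:
    # Synchronous speed of an AC machine: n = 120 * f / p
    return 120.0 * freq_hz / poles

def vf_voltage(freq_hz: float, rated_v: float = 460.0, rated_hz: float = 60.0) -> float:
    # Constant volts-per-hertz scaling below base speed: keeping the
    # V/f ratio fixed keeps the stator flux roughly constant.
    return rated_v * (freq_hz / rated_hz)

# Example: a hypothetical 4-pole, 460 V / 60 Hz motor at several drive frequencies
for f in (15, 30, 45, 60):
    rpm = synchronous_speed_rpm(f, poles=4)
    volts = vf_voltage(f)
    print(f"{f:>2} Hz -> {rpm:>5.0f} rpm (sync), drive output ~{volts:.0f} V")
```

So at 30 Hz the drive would put out roughly half the rated voltage (230 V) and the motor would run near half of its 1800 rpm synchronous speed, minus slip.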