Hi, I am currently simulating a three-phase induction motor system. One of the tests I have run is to keep the motor parameters (reactance, resistance, etc.) constant and vary the supply frequency (50 Hz, 40 Hz, 30 Hz, and so on), measuring the input quantities (current and power drawn from the supply) along with output quantities such as torque and rotational speed. Throughout, I have kept the supply voltage constant.

Having completed the tests, I found that when the frequency is dropped from 50 Hz to 40 Hz, and from 40 Hz to 30 Hz, both the current and the power drawn from the supply decrease. However, when the frequency is dropped below 30 Hz, the current drawn from the supply increases again, yet the power (watts) continues to decrease at constant voltage. I don't quite understand how this is possible. Could it be the result of a very poor power factor at low frequency, as the motor begins to stall and the inductive parameters become more influential on the system?
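To make my power-factor hypothesis concrete, here is a minimal Python sketch (the phase voltage and magnetising inductance are illustrative values I've assumed, not my actual simulation parameters). It shows how the magnetising branch behaves at constant voltage: its reactance Xm = 2*pi*f*Lm falls with frequency, so the nearly 90-degree-lagging magnetising current grows roughly as 1/f, adding amps and vars to the supply while contributing very few watts.

```python
import math

V = 230.0   # constant supply phase voltage (V) -- illustrative assumption
Lm = 0.25   # magnetising inductance (H) -- illustrative assumption

for f in (50, 40, 30, 20, 10):
    Xm = 2 * math.pi * f * Lm   # magnetising reactance shrinks with frequency
    Im = V / Xm                 # ~90 deg lagging magnetising current (A)
    Q = V * Im                  # reactive power this branch draws (var)
    print(f"{f:2d} Hz: Xm = {Xm:6.1f} ohm, Im = {Im:5.2f} A, Q = {Q:6.0f} var")
```

If this scaling matches what my simulation shows (rising current, falling real power, and a collapsing power factor below 30 Hz), would that confirm the reactive-current explanation?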