Vagabond7
Hi, I'm an EE student who has been helping in my professor's lab. We have a Doubly Fed Induction Generator (DFIG) setup, driven by a DC motor, that is used for experiments. The whole setup runs through a controller with a graphical user interface where you can adjust various parameters and watch real-time graphs of different quantities.
As anybody here probably knows, a DFIG has windings on both the stator and the rotor. I can control the voltage applied to the rotor windings from a computer. When the generator shaft is rotating, increasing the rotor voltage increases the stator's output voltage. Why? I don't really understand what is going on inside the machine. I can sort of see how I can control the stator frequency by changing the rotor frequency, but I don't understand why there is such a drastic difference in stator voltage. At 1600 RPM, I get 3.9 V RMS from the stator if I apply 3 V to the rotor, or 14.6 V RMS if I apply 7 V. That is a huge difference.
What is physically happening here?
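To make the question concrete, here is a quick sanity check of my own numbers (just arithmetic on the two measurements above, nothing else assumed). The voltage gain isn't even the same between the two operating points, which is part of what confuses me:

```python
# Measured at 1600 RPM: (rotor voltage applied, stator voltage RMS measured)
measurements = [(3.0, 3.9), (7.0, 14.6)]

for v_rotor, v_stator in measurements:
    gain = v_stator / v_rotor  # rotor-to-stator voltage gain at this point
    print(f"{v_rotor:.1f} V rotor -> {v_stator:.1f} V stator, gain = {gain:.2f}")
```

So the gain jumps from about 1.3 to about 2.1 between the two tests, i.e. the relationship doesn't even look linear.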