AC Motors and Variable Frequency Drives

In summary: The speed is controlled by the frequency of the rotating field. The current draw is going to depend on the motor construction. Every motor should have some sort of load curve.
  • #1
JonasS
Hello all. I have a question about AC motors that has been driving me crazy.

I understand how variable frequency drives work, by adjusting the power frequency and thus changing the rpm of the motor based on the equation: rpm = 120 x f / p where f = frequency (Hz) and p = number of motor poles.

What I don't understand is how this increase in motor speed relates to voltage and current. From the equation it doesn't look like they matter.

For example: 3-phase 4-pole AC motor connected to a 15A 110VAC 60Hz power source (US utility power). With no load to cause slip this motor will spin at 1800rpm.

Now put a variable frequency drive in between the motor and the outlet. Then crank the frequency up to 120Hz. The motor will reach 3600rpm. But what effect will this boost in rpm have on the power being drawn from the line?
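
To put numbers on the formula above, here is a minimal Python sketch of the synchronous-speed calculation (slip ignored; the function name is just for illustration):

```python
# Synchronous speed of an AC motor: rpm = 120 * f / p (slip ignored)
def synchronous_rpm(freq_hz: float, poles: int) -> float:
    """Return the no-slip (synchronous) speed in rpm."""
    return 120.0 * freq_hz / poles

for f in (60, 120):
    print(f"{f} Hz, 4 poles -> {synchronous_rpm(f, 4):.0f} rpm")
# 60 Hz, 4 poles -> 1800 rpm
# 120 Hz, 4 poles -> 3600 rpm
```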

From what I have read the only thing that changes is the frequency, so you should be drawing no more power than when it was spinning at 1800rpm. But that doesn't make sense, since there is more energy available in the faster spinning motor. Can anyone help me understand this issue?
 
  • #2
The power being drawn from the line depends on the load, so how much the power changes depends on how much the load changes when you change the frequency.

I'm an HVAC engineer and I often use vfds on fans. Fan horsepower is a cube function of rpm, so in my applications, a small increase in rpm makes for a big increase in amperage/wattage.
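
To put rough numbers on that cube relationship, here is a small sketch using the fan affinity laws; the baseline horsepower, rpm, and the speed increases below are just assumed example values:

```python
# Fan affinity laws: flow ~ rpm, pressure ~ rpm^2, shaft power ~ rpm^3
def fan_power_hp(base_hp: float, base_rpm: float, new_rpm: float) -> float:
    """Estimate fan shaft power after a speed change using the affinity laws."""
    return base_hp * (new_rpm / base_rpm) ** 3

base_hp, base_rpm = 10.0, 1750.0        # assumed example fan
for pct in (10, 20, 100):               # speed increases of 10%, 20%, 100%
    new_rpm = base_rpm * (1 + pct / 100)
    print(f"+{pct}% rpm -> {fan_power_hp(base_hp, base_rpm, new_rpm):.1f} hp")
# +10% rpm -> 13.3 hp  (a small rpm increase means a big power increase)
# +20% rpm -> 17.3 hp
# +100% rpm -> 80.0 hp
```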
 
  • #3
Well let us say it is the same load...a fan...for both instances. If the load, which I assume means the torque needed to spin the fan, stays the same at both 60Hz and 120Hz then how can it drain more power?
 
  • #4
Power = Torque x what?

Once you look in your book and find out what 'what' is, then look up for me the relationship between frequency and 'what', and you will have your answer.


Generally speaking, the load is a circuit element placed in your network. So the fan is the load, not the torque. The load of the fan will change as the fan draws more or less power depending on the speed of the fan blades, as said by Russ below.

Russ said:
Fan horsepower is a cube function of rpm, so in my applications, a small increase in rpm makes for a big increase in amperage/wattage.

If you have the same load, I don't care what it is or what you call it, it's going to use the same power. You can make the load a black box. As long as the value is the same, its effect on the network is going to be the same, period.

I want you to tell me the answer, because you already know it but are not seeing it.
 
  • #5
Power = Torque x rpm

I agree that I am just not getting something. Doesn't the speed at which the fan turns depend on the speed of the rotor...which is dependent on the AC frequency? Power just determines how quickly the fan will accelerate, right? :confused:

Okay, here is a real world example. A battery pack feeding DC current into a pulse width modulation inverter, which then feeds three-phase AC into a three-phase four-pole AC induction motor.

I set the inverter to an output of 12V at 60Hz. The rotor will accelerate until it reaches synchronous speed, which at 60Hz with four poles is 1800rpm.

Now I add a fan onto the motor shaft. Once again I set the inverter to 12V @ 60Hz. The rotor accelerates but does not reach synchronous speed because of the torque needed to spin the fan, which causes slip. Let's say it reaches 1750rpm.

Same setup, but now I set the inverter to 12V @ 120Hz. The rotor will accelerate up to, say, 3500rpm.

In both instances, at 60Hz and 120Hz, the voltage is the same, so the battery pack should provide the same amount of running time. The only difference is that in the 120Hz setup the fan will be spinning twice as fast.

Is that right? Am I missing something?
 
  • #6
I think your assumption that at the two frequencies your running time will be equal is false. The speed is controlled by the frequency of the rotating field. The current draw is going to depend on the motor construction. Every motor should have some sort of load curve. The equation you reference, which is a function of frequency and number of poles, assumes the motor is running at rated load.
 
  • #7
JonasS said:
Power = Torque x rpm

I agree that I am just not getting something. [snip] Power just determines how quickly the fan will accelerate, right? :confused:
Where in that equation does it say anything about acceleration? If you disconnect a fan from a motor it does slow down and stop... there is a load on it.

Take a step back and think about it. It's a fan! You know how simple a fan is, you're just thinking too hard. If a fan is spinning slowly, there is almost no air resistance because there is no air movement over the fan. I.e., at low rpm, there is almost no torque required to spin a fan. At higher (but constant) rpm, the air provides a higher, constant torque on the fan.
Doesn't the speed at which the fan turns depend on the speed of the rotor...which is dependent on the AC frequency?
Right. If you want, you can use different-sized pulleys to make the fan rotor spin at the same rpm while the motor rpm doubles. What happens to your load if your fan rpm hasn't changed...? Have the rpm and torque at the fan changed? What happens to your motor if you gear it down? Have the rpm and torque changed there?
 
  • #8
My question isn't about the fan. My question isn't about gear ratios. My question is about AC motors, and I still do not understand what voltage and current have to do with anything if the speed of the motor is based solely on the frequency of the AC power.

Let's see if I can make the question even more simple. You have two identical setups: a battery pack connected to a PWM inverter, connected to a 4-pole AC motor, connected to a fan. In one setup you set the inverter to output 12V at 60Hz. In the other setup you set the inverter to output 12V at 120Hz.

In the first setup, the fan will spin at 1800rpm minus slip (120x60/4 - slip)...so figure 1750rpm. In the second setup the fan will spin at 3600rpm minus slip (120x120/4 - slip)...so figure 3500rpm.

The second setup will move more air since the fan is spinning faster, but both setups should drain the battery in the exact same time because they are both using 12V. Is this correct?
 
  • #9
I would venture to say that both JonasS and those who have posted after have missed something. I can see where JonasS is coming from since he ASSUMES that the only thing changing when a motor is sped up to double speed is the frequency using 120 hertz instead of 60 hertz.
-
Cyrus: He already knows that power = torque x speed. That's his whole point. He is ASSUMING that the only thing changing is the frequency which is NOT included in the formula. This is why he is confused.
-
Russ: You've fallen into the same thing Cyrus has. Although I won't argue the whole relationship to air movement vs. power drawn.
-
Fred: While the current drawn is dependent upon motor construction, the main thing that the current drawn depends on is the load.
-
JonasS: I'm not sure why you're figuring in acceleration.
-
Here's the deal: The voltage fed into the motor must increase as frequency increases. This is due to the inductance of the motor. In order to force the same current through the motor at double the frequency the voltage has to increase as well because the inductive reactance goes up impeding current flow. The same thing happens in reverse. If we cut the frequency down to 15 hertz the voltage applied to the motor will have to be reduced in order to keep the motor from drawing too much current due to the lowered inductive reactance. You should now see where the extra power comes from when the speed of the motor is doubled.
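
One way to see the reactance argument numerically: treat the motor's magnetizing branch as a plain inductor and compare holding the voltage fixed against holding V/Hz constant. This is only a rough sketch; the 50 mH inductance is an assumed illustration, not data for any real motor:

```python
import math

def inductor_current(v_rms: float, freq_hz: float, l_henry: float) -> float:
    """Current drawn by an ideal inductor: I = V / X_L, with X_L = 2*pi*f*L."""
    return v_rms / (2 * math.pi * freq_hz * l_henry)

L = 0.05                        # assumed magnetizing inductance (50 mH), illustration only
V_RATED, F_RATED = 460.0, 60.0

for f in (15, 30, 60, 120):
    i_fixed_v = inductor_current(V_RATED, f, L)      # hold the voltage at 460 V
    v_scaled = V_RATED * f / F_RATED                 # hold V/Hz constant instead
    i_const_vhz = inductor_current(v_scaled, f, L)
    print(f"{f:>3} Hz: {i_fixed_v:5.1f} A at 460 V, "
          f"{i_const_vhz:5.1f} A at {v_scaled:.0f} V (constant V/Hz)")
# At low frequency a fixed 460 V drives far too much current; scaling the
# voltage with frequency keeps the magnetizing current roughly constant.
```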
 
  • #10
JonasS said:
My question isn't about the fan. My question isn't about gear ratios. My question is about AC motors, and I still do not understand what voltage and current have to do with anything if the speed of the motor is based solely on the frequency of the AC power.
But in order to know how the motor reacts, you need to know what the load is doing...
The second setup will move more air since the fan is spinning faster, but both setups should drain the battery in the exact same time because they are both using 12V. Is this correct?
No, it isn't correct. The fact that they are connected to a 12V battery does not tell you anything about how fast they will drain that battery. How fast they drain the battery depends on the amperage they draw from the battery, and the amperage depends on the impedance of the induction field, and the impedance of the induction field depends on the torque that the load is applying to it.
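
To make the battery-drain point concrete: runtime depends on the power actually drawn, not on the voltage alone. A rough sketch, with the battery capacity and power figures below purely assumed:

```python
def runtime_hours(capacity_wh: float, power_w: float) -> float:
    """Very rough battery runtime estimate (ignores converter losses, Peukert effect, etc.)."""
    return capacity_wh / power_w

CAPACITY_WH = 120.0                        # assumed 12 V, 10 Ah pack
for label, power_w in (("light fan load at 60 Hz", 30.0),
                       ("heavier fan load at 120 Hz", 240.0)):
    print(f"{label}: ~{runtime_hours(CAPACITY_WH, power_w):.1f} h")
# light fan load at 60 Hz: ~4.0 h
# heavier fan load at 120 Hz: ~0.5 h
```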
 
  • #11
Averagesupernova said:
I would venture to say that both JonasS and those who have posted after have missed something. I can see where JonasS is coming from since he ASSUMES that the only thing changing when a motor is sped up to double speed is the frequency using 120 hertz instead of 60 hertz.
-
Cyrus: He already knows that power = torque x speed. That's his whole point. He is ASSUMING that the only thing changing is the frequency which is NOT included in the formula. This is why he is confused.
I think you are missing something: speed = frequency. The speed of the motor is a multiple of the frequency in an a/c motor. So saying that the frequency is increasing is equivalent to saying the rpm is increasing. Hopefully, JonasS knows that...
Here's the deal: The voltage fed into the motor must increase as frequency increases. This is due to the inductance of the motor. In order to force the same current through the motor at double the frequency the voltage has to increase as well because the inductive reactance goes up impeding current flow. The same thing happens in reverse. If we cut the frequency down to 15 hertz the voltage applied to the motor will have to be reduced in order to keep the motor from drawing too much current due to the lowered inductive reactance. You should now see where the extra power comes from when the speed of the motor is doubled.
NO! The voltage is supplied by a power source and it is constant. When the frequency is increased, if the load increases, the amperage increases.

I think you - maybe both of you - need to review how an ac motor works: http://en.wikipedia.org/wiki/Electric_motor#AC_motors
In operation, the squirrel cage motor may be viewed as a transformer with a rotating secondary - when the rotor is not rotating in sync with the magnetic field, large rotor currents are induced; the large rotor currents magnetize the rotor and interact with the stator's magnetic fields to bring the rotor into synchronization with the stator's field. An unloaded squirrel cage motor at synchronous speed will consume electrical power only to maintain rotor speed against friction and resistance losses; as the mechanical load increases, so will the electrical load - the electrical load is inherently related to the mechanical load. This is similar to a transformer, where the primary's electrical load is related to the secondary's electrical load.
 
  • #12
It depends on what you're calling the power source, russ. I know that the power source the AC drive is connected to feeds it a constant voltage. I don't have time to read the link right now but give me about 12 hours and I will. I doubt that the link can tell me a lot I don't know about induction motors already. The amperage cannot increase past the design limit at ANY frequency. So where is the extra power coming from, russ? I would bet it's voltage, which is reflected in more current draw from the source feeding the drive.
-
Do a little searching on this forum russ. I have at least once explained correctly how induction motors work.
 
  • #13
Trust me, I have read just about everything there is to read about AC motors recently, and I have a very good understanding about how they work.

Problem is, Super seems to be right. I have just found several articles that talk about keeping the voltage/frequency ratio constant as the frequency is ramped up and the motor speed is increased.

While I usually don't trust Wikipedia as a sole source, I even found a few mentions of this that I must have missed in my first pass:

From http://en.wikipedia.org/wiki/Variable_frequency_drive:

"When a VFD starts, the applied frequency and voltage are increased at a controlled rate or ramped up to accelerate the load without drawing excessive current."

I have seen this on several sites, so it seems that while the frequency of the power is used to determine the speed of the rotor, you must increase voltage as you increase frequency.

But is this a "virtual" voltage increase just from the controller? I have read a few articles that make it seem that the controller simulates a higher voltage, and that the actual draw from the line (or battery, in this case) stays the same. I don't know this for sure.

If you do need to increase the voltage as you increase the frequency, how do you calculate what that increase needs to be for a given frequency? I would assume that information would come from the motor, but is it the controller instead?

In my previous example, how do I know how much power the motor is pulling from the battery at a given frequency?
 
  • #14
Averagesupernova said:
It depends on what you're calling the power source russ. I know that the power source the AC drive is connected to feeds it a constant voltage.
Huh? If you know that the power source is at constant voltage, then what is there that depends? A typical vfd is fed from a 480v circuit and always feeds 480v to the motor.
I don't have time to read the link right now but give me about 12 hours and I will. I doubt that the link can tell me a lot I don't know about induction motors already.
Well...
The amperage cannot increase past the design limit at ANY frequency. So where is the extra power coming from russ?
Huh? No, again. As load is added to the motor (regardless of whether the load is increasing due to increasing the frequency - maybe the pump is just getting clogged with mud), the amperage will increase until the motor physically burns up unless you have safety devices to prevent it. That's what causes motors to fail! [see below]
Do a little searching on this forum russ. I have at least once explained correctly how induction motors work.
Eeek. Then I must have missed it and I'm sorry. You're really, really wrong.

I spent my day today doing startup services on a central chilled and warm water plant at a manufacturing facility. The controls are not in place, so we had to start up the plant manually, opening and closing valves and adjusting the vfds to get the water flow we needed. The VFDs are fed from and feed 480V, 3-phase power - the most common type of power for such applications.

The VFDs have data points for the specifications of the motor they are connected to. They monitor the amperage of the motor and will protect it from overamping should something happen to the motor (ie, if the pump clogs and overloads the motor). If that information is entered wrong, you can manually drive the vfd up past 60Hz (120 is the typical limit of a drive's capability) and the amperage will increase as a cube function of frequency/rpm in a fan or pump.
 
  • #15
  • #16
russ_watters said:
Huh? If you know that the power source is at constant voltage, then what is there that depends? A typical vfd is fed from a 480v circuit and always feeds 480v to the motor.

My point was defining what you call the source. I think we are in agreement that the 480V 3 phase line feeding the VFD is the source. No argument there. However, my argument is with the "...always feeds 480v to the motor..." part. It cannot do that at lower frequencies because the lowered inductive reactance will cause excessive current draw even at light to no load, and the way this is remedied is to lower the voltage. I'm pretty sure that the voltage has to change at higher frequencies, where the higher inductive reactance impedes current flow to the point that there isn't enough of a magnetic field to turn the rotor. However, I'm NOT saying that you can get 2 HP out of a 1 HP motor simply by doubling the frequency and voltage.

russ_watters said:
Huh? No, again. As load is added to the motor (regardless of whether the load is increasing due to increasing the frequency - maybe the pump is just getting clogged with mud), the amperage will increase until the motor physically burns up unless you have safety devices to prevent it. That's what causes motors to fail! [see below]

You have no argument from me that as the load increases the current will go up. Please do not imply that, since in my first post I pointed this out specifically to Fred. I probably did not correctly phrase what you quoted. The current can certainly go up but should not be allowed to run past the design limit for any length of time at all. It is not uncommon at all to have motors develop several times their rated horsepower intermittently. In order to do this at frequencies above 60 hertz I don't see how the voltage can remain the same as it is when the motor is running at 60 hertz. It may vary from motor to motor.
 
  • #17
Averagesupernova said:
Fred: While the current drawn is dependent upon motor construction, the main thing that the current drawn depends on is the load.
Obviously. My point was that each motor with the same load will draw different currents based on its construction.
 
  • #18
http://en.wikipedia.org/wiki/Variable_frequency_drive

Specifically, the part that is about halfway down the page, around the schematic, where they talk about volts per hertz. On a 460 volt motor designed to run at 60 hertz this works out to 7.67 V/Hz: for every hertz the frequency is adjusted, the voltage is also adjusted by 7.67 volts. Voltage is reduced to half when the motor is running at 30 hertz, for instance. They don't specifically mention increasing the voltage above 460 volts when running above 60 hertz. I suppose it isn't necessary if you simply want to spin something up faster and the load doesn't demand the torque that would otherwise stall the motor at the higher frequency.
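
Putting that volts-per-hertz rule into a short sketch, assuming the 460 V / 60 Hz rating mentioned above and clamping at rated voltage above 60 Hz (consistent with drives not pushing the voltage past 460 V):

```python
V_RATED, F_RATED = 460.0, 60.0        # nameplate rating from the post
V_PER_HZ = V_RATED / F_RATED          # 460 / 60 ≈ 7.67 V/Hz

def commanded_voltage(freq_hz: float) -> float:
    """Voltage a constant-V/Hz drive would command, clamped at rated voltage."""
    return min(V_PER_HZ * freq_hz, V_RATED)

for f in (15, 30, 60, 90, 120):
    print(f"{f:>3} Hz -> {commanded_voltage(f):.0f} V")
#  15 Hz -> 115 V
#  30 Hz -> 230 V
#  60 Hz -> 460 V
#  90 Hz -> 460 V  (clamped: field-weakening region)
# 120 Hz -> 460 V
```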
 
  • #19
Okay, I have been digging up everything I can find online about this, and while nobody here was a huge help it turns out that averagesupernova was the one who was sort of correct.

It is true that when you use AC motors the ratio between voltage and frequency should be kept at a constant. This "volts per hertz" (V/Hz) is set by comparing the rated voltage and frequency for a specific drive. Most typical AC motors are rated for 460VAC and 60Hz, giving a ratio of 7.67 V/Hz. All this is what supernova put in his post.

There is nothing stopping you from increasing the frequency and leaving the voltage alone. However flux, magnetizing current, and torque are all dependent on this ratio. While increasing frequency without increasing voltage will cause an increase in speed, flux will decrease causing the motor torque to decrease. Magnetizing current will decrease, which will cause a corresponding decrease in stator line current.

AC drives can operate with constant flux from zero frequency up to rated frequency. This is called the "constant torque range" because as long as the V/Hz ratio is maintained the motor will have constant torque. The variable frequency drive unit automatically varies the voltage.

Once you go above the rated frequency, you cannot continue to increase voltage since supply voltage cannot exceed rated voltage. At this point the V/Hz ratio decreases and the motor is said to be in "field weakening". The horsepower stays constant but the torque decreases as frequency increases above the rated level.

The field weakening factor is Ffw = (Fr / Fe)^2, where Fr = rated frequency and Fe = extended frequency. For instance, say you have a 60Hz rated motor and you push it to 90Hz:

(60/90)^2 ≈ 0.444, or about 44%

So your motor will only develop about 44% of rated torque at 90Hz.
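
The same derating, written out as a small helper that reproduces the 90Hz example (and, for comparison, 120Hz):

```python
def torque_fraction_above_base(f_rated: float, f_extended: float) -> float:
    """Approximate available torque fraction in field weakening: (Fr / Fe)^2."""
    return (f_rated / f_extended) ** 2

for f_ext in (60, 90, 120):
    print(f"{f_ext} Hz -> {torque_fraction_above_base(60.0, f_ext):.0%} of rated torque")
# 60 Hz -> 100% of rated torque
# 90 Hz -> 44% of rated torque
# 120 Hz -> 25% of rated torque
```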

So to sum up, you do not have to increase voltage when you increase frequency, but you should if you want to maintain constant torque.

What this is telling me is that pushing the motor in this way does not drain the voltage source faster. Going from 60Hz to 120Hz will not drain the battery pack any faster. However, going from 60Hz to 30Hz will not save you any power, since the variable drive unit is what lowers the voltage by using pulse width modulation. The battery feeds a constant voltage and current to the drive unit, and the drive unit synthesizes the new voltage and current to the motor.

If anyone wants to check out some pretty in-depth tutorials on all this, go to the Siemens website, www.sea.siemens.com. They have interactive tutorials as well as PDF downloads.
 
  • #20
Okay, a slight change to what I posted above:

What this is telling me is that pushing the motor in this way does not drain the voltage source faster. Going from 60Hz to 120Hz will not drain the battery pack any faster. However, going from 60Hz to 30Hz will not save you any power, since the variable drive unit is what lowers the voltage by using pulse width modulation. The battery feeds a constant voltage and current to the drive unit, and the drive unit synthesizes the new voltage and current to the motor.

This is slightly incorrect. Going slower WILL save you power.

The PWM inverter changes the voltage supplied to the motor by "stuttering" the power...turning the circuit on and off rapidly to adjust the average voltage. So when you are at maximum power the inverter is supplying the full voltage to the motor. As you decrease the power (i.e. let up off the throttle) the inverter starts to "stutter" the circuit. When the circuit is closed there is power flowing to the motor...when the circuit is open there is no power flowing.

I guess a pretty close analogy would be an internal combustion engine that controls engine power not by adjusting the fuel flow rate, but by turning the fuel flow on and off rapidly. When the fuel is flowing it is flowing at 100% rate. So to get 100% power you let the fuel flow completely. If you want 50% power you turn the flow on and off rapidly so the flow is on for 50% of the time instead of 100%.
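
A bare-bones numeric version of that "stuttering": the average voltage an ideal PWM stage delivers is just the DC bus voltage times the fraction of time the switch is on (the duty cycle). The 12 V figure matches the battery example earlier; the duty values are just assumed:

```python
def pwm_average_voltage(v_bus: float, duty: float) -> float:
    """Average output of an ideal PWM stage: V_avg = V_bus * duty."""
    if not 0.0 <= duty <= 1.0:
        raise ValueError("duty cycle must be between 0 and 1")
    return v_bus * duty

V_BUS = 12.0                                  # battery voltage from the earlier example
for duty in (1.0, 0.5, 0.25):
    print(f"duty {duty:4.0%} -> {pwm_average_voltage(V_BUS, duty):4.1f} V average")
# duty 100% -> 12.0 V average
# duty  50% ->  6.0 V average
# duty  25% ->  3.0 V average
```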

So to sum up: to increase the speed of an AC motor you increase the frequency. If you want to keep constant torque you must also increase the voltage supplied to the motor to keep the volts/hertz ratio constant. The more voltage you supply to the motor the faster you drain your battery or other voltage source. Once you hit the maximum supply voltage you can continue to increase frequency, but the volts/hertz ratio will change and you will lose torque while keeping constant horsepower. Going beyond this point will not drain the battery any faster.
 
  • #21
russ_watters said:
The power being drawn from the line depends on the load, so how much the power changes depends on how much the load changes when you change the frequency.

I'm an HVAC engineer and I often use vfds on fans. Fan horsepower is a cube function of rpm, so in my applications, a small increase in rpm makes for a big increase in amperage/wattage.

I also think it depends on the exact technique used in the VFD.

Of course, in different situations, VFDs behave differently.
 

What is an AC motor?

An AC (alternating current) motor is an electric motor that converts electrical energy into mechanical energy through the use of alternating current. It is commonly used in household appliances, industrial machinery, and vehicles.

How does an AC motor work?

An AC motor works by using a magnetic field to create rotational motion. The motor has two main components - a stator and a rotor. The stator is a stationary part that contains the electromagnets, while the rotor is a rotating part that contains the conductors. When an alternating current is passed through the stator, it creates a rotating magnetic field that interacts with the conductors in the rotor, causing it to spin.

What is a variable frequency drive?

A variable frequency drive (VFD) is a type of motor controller that is used to control the speed and torque of an AC motor by adjusting the frequency of the electrical supply. It allows for precise control over the motor's speed and can help reduce energy consumption and improve efficiency.

How does a variable frequency drive work?

A VFD works by converting the incoming AC power into direct current (DC) and then using transistors to convert the DC power back into AC power with a variable frequency. By adjusting the frequency of the AC power, the VFD can control the motor's speed and torque. It also includes additional features such as overload protection and monitoring of motor performance.
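
As a toy illustration of the invert-with-variable-frequency step, the sketch below computes per-sample duty cycles for one output phase from a commanded frequency, scaling the amplitude with a constant volts-per-hertz rule. Real drive firmware is far more involved; the ratings and sample count here are assumptions for illustration only:

```python
import math

V_RATED, F_RATED = 460.0, 60.0     # assumed motor nameplate rating

def phase_duty_cycles(f_cmd_hz: float, n_samples: int = 8) -> list[float]:
    """Duty cycles over one cycle of a sine reference at f_cmd_hz, with the
    amplitude scaled by a constant V/Hz rule and clamped at rated voltage."""
    v_cmd = min(V_RATED / F_RATED * f_cmd_hz, V_RATED)
    modulation = v_cmd / V_RATED                   # normalized output amplitude, 0..1
    return [0.5 + 0.5 * modulation * math.sin(2 * math.pi * k / n_samples)
            for k in range(n_samples)]

print([round(d, 2) for d in phase_duty_cycles(30.0)])   # half speed: smaller voltage swing
print([round(d, 2) for d in phase_duty_cycles(60.0)])   # rated speed: full voltage swing
```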

What are the benefits of using AC motors and variable frequency drives?

There are several benefits to using AC motors and variable frequency drives. These include improved energy efficiency, reduced maintenance costs, and better control over motor speed and torque. VFDs also offer soft-start capabilities, which reduce the wear and tear on motors and equipment. Additionally, they can help extend the lifespan of the motor and improve overall system performance.
