How Does Low Voltage Affect Induction Motor Performance?


Discussion Overview

The discussion revolves around the effects of operating an induction motor at a lower voltage and higher frequency than its rated specifications. Participants explore various performance factors, including torque, starting issues, and the implications of voltage-to-frequency (V/f) ratios in variable frequency drive (VFD) applications.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • One participant notes that running a 1HP induction motor at 185V and 65Hz may lead to starting issues and reduced torque, regardless of whether it is single-phase or three-phase.
  • Another participant suggests that operating at lower voltage could cause the motor to run warmer, although they believe it may not be a significant concern.
  • A participant provides calculations indicating that at the lower voltage, the motor would operate at approximately 74% flux density, achieving about 60% peak torque and as low as 54% starting torque.
  • One participant expresses confusion about the popularity of constant V/f control, questioning how the voltage and frequency scaling affects motor performance, particularly when the inverter's output voltage is lower than the supply voltage.
  • Another participant emphasizes the importance of maintaining the V/f ratio to ensure the motor's rated torque is preserved while adjusting speed.
  • A later reply requests clarification on the calculations related to torque changes when operating a different induction motor at varying voltage and frequency.

Areas of Agreement / Disagreement

Participants broadly agree that lower voltage reduces torque and can cause starting problems, but questions about the implications of constant V/f control remain contested. The discussion ends without consensus on the best approach or a shared understanding of the V/f relationship.

Contextual Notes

Participants reference calculations and assumptions regarding flux density and torque without providing detailed methodologies, leaving some aspects of the discussion open to interpretation and further exploration.

Who May Find This Useful

Individuals interested in induction motor performance, variable frequency drives, and electrical engineering applications may find this discussion relevant.

manche:
I have a 1 HP induction motor rated at 230 V (line), 60 Hz. If I run it at 185 V (line) and 65 Hz, what aspects of its performance are affected?

I am hoping for your expert advice.
 
If this is a single-phase motor it may have starting issues, and it will have less torque in general regardless of whether it is single-phase or three-phase. I believe Wikipedia has some good material on variable frequency drives; you will likely get some good ideas from there.
 
At this lower voltage, you might see the motor running a little warmer than it would at the rated voltage. It's probably not enough of a difference to cause much concern, but I would want to try it and see. What is your application?
For induction motors, as Averagesupernova says, it might have a little trouble starting. Again, you can try it and see.
Here's a Wikipedia page explaining how slip affects induction motor speed.
http://en.wikipedia.org/wiki/Induction_motor

Welcome to PF
 
By my calculations you'll be running at about 74% flux density, getting about 60% peak torque and as little as 54% starting torque.
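The 74% flux figure above can be sanity-checked with the common approximation that air-gap flux scales as V/f. The torque figures are harder to reproduce without the poster's method; a crude model that neglects stator impedance and takes breakdown torque proportional to (V/f)² gives a slightly different (lower) number, so treat the sketch below as a ballpark check, not a reconstruction of the original calculation:

```python
# Rough scaling check for running a 230 V / 60 Hz motor at 185 V / 65 Hz.
# Assumptions (mine, not from the thread): air-gap flux B scales as V/f,
# and breakdown torque scales as (V/f)^2 when stator impedance is
# neglected. The 60% / 54% figures in the post presumably come from a
# fuller equivalent-circuit model.

V_rated, f_rated = 230.0, 60.0
V_new, f_new = 185.0, 65.0

flux_ratio = (V_new / f_new) / (V_rated / f_rated)  # B ∝ V/f
breakdown_ratio = flux_ratio ** 2                   # T_max ∝ (V/f)^2, crude

print(f"flux density:     {flux_ratio:.0%}")        # ~74%, matching the post
print(f"breakdown torque: {breakdown_ratio:.0%}")   # ~55% with this crude model
```

The flux ratio agrees with the post; the simple quadratic torque scaling lands a few points below the quoted 60%, which is consistent with it ignoring the stator voltage drop.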
 
I see that lower voltage means degraded performance.

My confusion is: if that is the case, why is constant V/f so popular? Let me make my confusion clearer.
For V/f control, an inverter is used to generate the three phases. The inverter is fed by a DC bus voltage, which is itself rectified from the supply voltage.
Here is the problem, step by step:
- If the supply voltage is 208 V line, the DC bus is about 294 V (using a three-phase bridge rectifier).
- The maximum phase voltage from the inverter is only 0.5 Vdc (about 147 V peak).
- So a motor driven directly at 208 V is now driven at only sqrt(3) × 147 V / sqrt(2) ≈ 180 V RMS line voltage.
- Since all other V and f values for constant V/f are scaled from this 180 V RMS, their performance is skewed too.

I went through different VFDs for V/f control, and almost all use the same scheme. Am I missing something here?
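The voltage chain described in the steps above can be worked through numerically. One point worth noting (my addition, not from the thread): the 0.5 Vdc limit applies only to plain sinusoidal PWM; space-vector PWM or third-harmonic injection raises the usable phase peak to Vdc/√3, which recovers the full supply line voltage and is one likely answer to "am I missing something":

```python
import math

# Voltage chain from supply to inverter output, for a 208 V line supply.
V_line = 208.0                       # supply line-to-line RMS

V_dc = math.sqrt(2) * V_line         # ideal bridge-rectifier peak, ~294 V
# (a 6-pulse rectifier's average DC is 1.35*V_line ~ 281 V; the post uses
# the peak value, kept here for consistency)

# Plain sinusoidal PWM: max fundamental phase peak is Vdc/2.
V_phase_peak_spwm = V_dc / 2                                       # ~147 V
V_line_rms_spwm = math.sqrt(3) * V_phase_peak_spwm / math.sqrt(2)  # ~180 V

# Space-vector PWM / third-harmonic injection: phase peak rises to Vdc/sqrt(3).
V_phase_peak_svm = V_dc / math.sqrt(3)                             # ~170 V
V_line_rms_svm = math.sqrt(3) * V_phase_peak_svm / math.sqrt(2)    # ~208 V

print(f"SPWM line RMS: {V_line_rms_spwm:.0f} V")  # the shortfall in the post
print(f"SVM  line RMS: {V_line_rms_svm:.0f} V")   # full supply voltage recovered
```

With sinusoidal PWM the output is indeed limited to about 180 V line, as the post computes; with the improved modulation schemes that most commercial VFDs use, the full 208 V is available.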
 
I'm not sure what you are asking here, but the general rule is to maintain the voltage-to-frequency ratio (V/f) in order to maintain the motor's rated torque (230/60 in this case). So you can speed the motor up or slow it down while keeping the torque capability constant. If you want to decrease its torque, reduce the V/f ratio.
http://en.wikipedia.org/wiki/Variable_Frequency_Drive
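The constant-V/f rule described above can be sketched as a simple voltage-command function. This is a minimal illustration of the principle only (real drives add a low-speed voltage boost and other compensation); the function name and structure are mine:

```python
V_RATED, F_RATED = 230.0, 60.0   # nameplate values from the thread

def vf_command(f_hz: float) -> float:
    """Voltage command for a constant-V/f drive (sketch, no boost term).

    Below base frequency, V scales linearly so V/f — and hence flux and
    available torque — stays at the rated value. Above base frequency the
    voltage saturates at the rating, so flux and torque capability fall
    off (the field-weakening region).
    """
    return min(f_hz * V_RATED / F_RATED, V_RATED)

for f in (15, 30, 60, 65):
    print(f"{f:>3} Hz -> {vf_command(f):6.1f} V  (V/f = {vf_command(f)/f:.2f})")
```

Note that at 65 Hz the command clamps at 230 V, so V/f drops below the rated 230/60 — which is exactly why the original poster's 185 V / 65 Hz operating point gives reduced flux and torque.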
 
uart said:
By my calculations you'll be running at about 74% flux density, getting about 60% peak torque and as little as 54% starting torque.

Can you please explain or show your calculations for this? I have an induction motor that is nominally 400 V, 50 Hz, and I would like to know how its torque changes when running at 400 V, 60 Hz. Thanks.
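The same rough scaling used earlier in the thread can be applied to this 400 V / 50 Hz case. As before, this assumes flux ∝ V/f and breakdown torque ∝ (V/f)², with stator impedance neglected, so it is an approximation rather than the earlier poster's exact method:

```python
# Running a 400 V / 50 Hz motor at 400 V / 60 Hz: voltage is unchanged,
# so raising the frequency lowers V/f and weakens the field.
# Assumptions (mine): flux ∝ V/f, breakdown torque ∝ (V/f)^2.
V, f_old, f_new = 400.0, 50.0, 60.0

flux_ratio = (V / f_new) / (V / f_old)   # = 50/60 ≈ 0.833
torque_ratio = flux_ratio ** 2           # ≈ 0.694

print(f"flux:             {flux_ratio:.0%}")   # ~83% of rated
print(f"breakdown torque: {torque_ratio:.0%}") # ~69% of rated
```

So, under this crude model, breakdown torque drops to roughly 69% of rated, while the no-load speed rises by the 60/50 frequency ratio.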
 
