What are the disadvantages of having low power factor?

sirsajid
What are the disadvantages of having low power factor?
 
This is from the excellent article in Wikipedia about Power Factor:

The significance of power factor lies in the fact that utility companies supply customers with volt-amperes, but bill them for watts.
Power factors below 1.0 require a utility to generate more than the minimum volt-amperes necessary to supply the real power (watts).

This increases generation and transmission costs. For example, if the load power factor were as low as 0.7, the apparent power would be 1.4 times the real power used by the load.

Line current in the circuit would also be 1.4 times the current required at 1.0 power factor, so the losses in the circuit would be doubled (since they are proportional to the square of the current).

Alternatively all components of the system such as generators, conductors, transformers, and switchgear would be increased in size (and cost) to carry the extra current.

You can read the rest of it here:
http://en.wikipedia.org/wiki/Power_factor
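
To put rough numbers on the 0.7 example above, here is a minimal Python sketch. The 10 kW load, 230 V supply and 0.5 ohm line resistance are assumed values chosen only for illustration, not figures from the article; the point is that the line current rises by 1/0.7 (about 1.43) and the I^2*R loss in the conductors comes out roughly twice as large.

# Illustrative comparison of unity power factor versus power factor 0.7.
# All numbers below are assumptions made up for this sketch.

real_power = 10_000.0   # W, real power actually consumed by the load
voltage = 230.0         # V, supply voltage (single-phase assumed for simplicity)
line_resistance = 0.5   # ohms, assumed resistance of the supply conductors

for pf in (1.0, 0.7):
    apparent_power = real_power / pf                   # VA the utility must supply
    line_current = apparent_power / voltage            # A flowing in the conductors
    line_loss = line_current ** 2 * line_resistance    # I^2 * R loss in the line
    print(f"pf={pf}: S={apparent_power:.0f} VA, "
          f"I={line_current:.1f} A, line loss={line_loss:.0f} W")

# At pf = 0.7 the current is 1/0.7 = 1.43 times the unity-pf current,
# so the I^2*R loss is about (1/0.7)^2 = 2.04 times larger, i.e. roughly doubled.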
 
vk6kro said:
Thank you very much for your answer. I have got it now.
Could you tell me the difference between switchgear and circuit breaker?
 
Switchgear is the equipment used for the normal switching of a circuit: you turn it on, you turn it off.

Circuit breakers are for overload conditions where too much current is being drawn by a circuit and the power is removed to avoid further damage or fires.

There is some overlap between the two. A circuit breaker is commonly incorporated into the power switch itself, so that the same device that switches the circuit also trips it on overload.
 
Next homework question please!
 
Couldn't be homework, because that isn't allowed here. :smile:

I'd rather see someone learn something than be too dogmatic about it. This one just learned about Wikipedia.
 
Averagesupernova said:
Next homework question please!
What is the optimum generation voltage? And what are the different voltage levels from generation to the consumer? I mean generation... transmission... sub-transmission... distribution... what are the voltage levels at each stage?
 