## Power factor question

In AC circuits, it seems that no net energy is consumed by capacitors and inductors.

I was wondering: how does the energy stored and returned by these components cause the power factor to increase or decrease? And also, why does the power provider have to supply power (apparent power) for something that is ultimately not even used?

Thanks..
 Mentor Apparent power is not supplied as real power by the generator. The reason the power company gets annoyed at (and charges you for) low power factor is that the capacity of generators and wires is based on amperage, not wattage.
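The point about capacity being set by amperage can be sketched numerically: for a fixed real power, the current the wires and generator must carry grows as the power factor drops. A minimal sketch (the 10 kW / 240 V figures are just illustrative):

```python
def line_current(real_power_w, v_rms, power_factor):
    """Current drawn for a given real power at a given power factor.

    The wires and generator must be sized for this current (and for the
    apparent power S = V * I), even though billing is mostly for real power.
    """
    apparent_power_va = real_power_w / power_factor
    return apparent_power_va / v_rms

# Example: a 10 kW load on a 240 V supply.
i_unity = line_current(10_000, 240, 1.0)  # unity power factor
i_low = line_current(10_000, 240, 0.7)    # same real power, PF = 0.7

# At PF 0.7 the same real power draws about 43% more current.
```

So a plant delivering the same wattage at a worse power factor needs heavier conductors and a larger generator, which is what the utility charges for.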
Thanks! So, aren't there generators rated by wattage? Or can we ignore the apparent power and use a generator that supplies only the real power? Also, when the current returns to the distribution point, different cables will be at different power factors, so will that cause any problem?


Power comes to us in three phases, each 120 degrees out of phase with the next. We derive power by flowing current between the phases. If the phases shift from their ideal 120-degree separation, full current still flows through each phase, which is what we get charged for. But there is now slightly less voltage between the phases at each instant in time, so we get less usable power between them.
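The "less voltage between the phases" claim can be checked with phasor arithmetic: two equal-amplitude phases separated by an angle θ have a line-to-line magnitude of 2·V·sin(θ/2), which peaks well above the 120° value and falls off if the separation shrinks. A quick sketch (the 120 V figure is illustrative):

```python
import cmath
import math

def line_to_line(v_phase, theta_deg):
    """Magnitude of the phasor difference between two equal-amplitude
    phases separated by theta_deg; equals 2 * v_phase * sin(theta/2)."""
    theta = math.radians(theta_deg)
    return abs(v_phase - v_phase * cmath.exp(1j * theta))

v = 120.0  # illustrative phase voltage
v_ideal = line_to_line(v, 120)    # 120 * sqrt(3), the normal case
v_shifted = line_to_line(v, 110)  # separation reduced by 10 degrees
```

With the separation reduced to 110°, the line-to-line voltage drops a few percent below the √3 value, matching the post's point that a phase shift costs usable voltage between phases.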

Correcting this phase shift normally means adding capacitors. But with the expanded use of distributed generation, we can run the DG generators out of phase in the opposite direction to help correct the power factor on the grid. This is how generator owners sell VARs to the power company.
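Sizing those correction capacitors follows from the standard reactive-power relations: the capacitor must supply Qc = P·(tan φ_old − tan φ_new), and a shunt capacitor across an RMS voltage V at frequency f supplies Qc = 2πfCV². A sketch with illustrative numbers (50 kW load, 400 V, 50 Hz):

```python
import math

def correction_capacitor_farads(real_power_w, pf_old, pf_new, v_rms, freq_hz):
    """Shunt capacitance needed to raise a lagging power factor.

    Reactive power to cancel: Qc = P * (tan(phi_old) - tan(phi_new)),
    where phi = arccos(pf). A capacitor supplies Qc = 2*pi*f*C*V^2.
    """
    phi_old = math.acos(pf_old)
    phi_new = math.acos(pf_new)
    q_c = real_power_w * (math.tan(phi_old) - math.tan(phi_new))
    return q_c / (2 * math.pi * freq_hz * v_rms ** 2)

# Example: raise a 50 kW load from PF 0.75 to 0.95 on a 400 V, 50 Hz line.
c = correction_capacitor_farads(50_000, 0.75, 0.95, 400, 50)
```

This works out to a few hundred microfarads; in practice the utility or customer installs switched capacitor banks rated in kvar rather than a single capacitor.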

Mentor
 Quote by daredevil Thanks! So, aren't there generators rated by wattage? Or can we ignore the apparent power and use a generator that supplies only the real power?
The real power is "real" in the sense that it directly relates to how much fuel the power plant burns; that's why it is the main component of a commercial bill. But how hot a wire or generator gets is a function of amperage, so the worse the power factor, the higher the amperage, and therefore the larger the wires and generator need to be, or the higher the losses will be.
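The loss penalty can be made concrete: feeder heating is I²R, and since current scales as 1/pf for fixed delivered power, the loss scales as 1/pf². A minimal sketch (the 10 kW, 240 V, 0.1 Ω numbers are illustrative):

```python
def line_loss_w(real_power_w, v_rms, power_factor, wire_resistance_ohm):
    """I^2 * R heating in the feeder for a given load power factor.

    Current scales as 1/pf for fixed real power, so loss scales as 1/pf^2.
    """
    current = real_power_w / (power_factor * v_rms)
    return current ** 2 * wire_resistance_ohm

loss_unity = line_loss_w(10_000, 240, 1.0, 0.1)  # PF = 1.0
loss_half = line_loss_w(10_000, 240, 0.5, 0.1)   # PF = 0.5

# Same delivered power at half the power factor: four times the wire heat.
```

This quadratic penalty is why utilities meter and surcharge reactive power for industrial customers rather than simply ignoring it.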
