Why won't power utility companies charge for kVA instead of kW?

  1. May 5, 2017 #1
    Hello all,

    As I understand it (and as I see on my electricity bill), power utility companies (in the residential market) charge only for kW, not kVA. But somehow I find it a bit weird when I think about it more deeply.

    Wouldn't it make more sense to charge for kVA? Otherwise, it seems I could draw an "unlimited" amount of VARs (huge inductive currents, for example) with a power factor of zero (or close to it) and not be charged a single dime. I understand it is very unlikely for a household to have only inductive or capacitive loads, but for the sake of this discussion, what am I missing in my reasoning? Are they statistically assuming that all households usually have a great power factor (close to one), and so figure kW ≅ kVA?
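    To put numbers on the distinction, here is a minimal Python sketch (illustrative values only, not taken from any actual bill) of how real, reactive, and apparent power relate:

    ```python
    import math

    # Illustrative household load (assumed values, not from a real bill)
    P_kw = 2.0   # real power, what a residential meter bills for (kW)
    pf = 0.8     # assumed lagging power factor

    S_kva = P_kw / pf                        # apparent power (kVA): what the wires must carry
    Q_kvar = math.sqrt(S_kva**2 - P_kw**2)   # reactive power (kVAR): billed ~nothing residentially

    print(f"Real:     {P_kw:.2f} kW")
    print(f"Apparent: {S_kva:.2f} kVA")
    print(f"Reactive: {Q_kvar:.2f} kVAR")
    ```

    At a power factor near 1, kVA ≈ kW and the distinction disappears; my question is about what happens when it doesn't.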

    Are kVA meters only installed in industrial markets?

    Thank you in advance for your help.
     
  3. May 5, 2017 #2

    russ_watters (Staff: Mentor)

    You don't get charged for VARs because they cost very little to produce; they only affect the equipment/wire sizing. kW/kWh is what determines how big the prime mover needs to be and how much fuel (or wind, sunlight, water) it consumes.
     
  4. May 5, 2017 #3
    Thanks russ,

    When you say they "only affect the equipment/wire size," wouldn't that be enough to justify charging me for those huge, purely inductive currents I'm drawing? In theory, that wire size could need to be arbitrarily large... just a thought :)

    My understanding is that they do have kVA meters for industrial premises (to be confirmed). Otherwise, what would be the point of correcting the power factor if, at the end of the day, they only charge me for the kilowatts I'm using, no matter how inefficient my installation is?
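    For reference, the standard correction calculation as I understand it: a capacitor bank is sized to supply Qc = P·(tan φ1 − tan φ2), where φ1 and φ2 are the angles of the initial and target power factors. A sketch with assumed numbers:

    ```python
    import math

    # Sizing a power-factor-correction capacitor bank (all numbers assumed)
    P_kw = 500.0        # industrial plant's real power draw
    pf_initial = 0.70   # uncorrected power factor
    pf_target = 0.95    # target power factor after correction

    phi1 = math.acos(pf_initial)
    phi2 = math.acos(pf_target)

    # Reactive power the bank must supply: Qc = P * (tan(phi1) - tan(phi2))
    Q_c = P_kw * (math.tan(phi1) - math.tan(phi2))
    print(f"Capacitor bank needed: {Q_c:.0f} kVAR")   # ~346 kVAR for these numbers
    ```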
     
  5. May 5, 2017 #4

    russ_watters (Staff: Mentor)

    I suppose, but they only need to pay for the wires once. There is also the resistive loss from the higher current, but that is pretty small.

    Thinking about it more, and with your reminder: most commercial rates do include a penalty for a bad power factor. I believe it is ratioed into the demand charge. For residential, it probably isn't worth the effort to meter.

    Thing is, you can't do anything with a VAR, so there is no incentive to try to draw them.
     
  6. May 5, 2017 #5
    yep, that was my thought.

    If anyone wants to add to this, please feel free.

    Thanks for your time.
     
  7. May 5, 2017 #6

    russ_watters (Staff: Mentor)

    You're welcome!
     
  8. May 7, 2017 #7

    vk6kro (Science Advisor)

    Power companies would love to charge for reactive current. Customers would object that they get no value from it, so why should they pay for it?

    Imagine you were buying lobsters and the lobster dealer put them in a heavy wooden box before weighing them.

    Lobsters are expensive but wooden boxes are heavy and relatively cheap.
    So, you could insist that the weight of the box be subtracted so that you only pay for the lobsters.
    The boxes can be returned when they are empty.

    Reactive current is like the box. It is necessary to deliver the power you want, but the power company gets it back later in the cycle.
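    The "gets it back later in the cycle" part can be checked numerically. A minimal sketch, assuming a sinusoidal supply and a purely inductive load (current lagging voltage by 90°):

    ```python
    import math

    # Average power over one cycle for a purely inductive load (unit amplitudes)
    N = 1000
    avg = 0.0
    for k in range(N):
        t = k / N                                     # one full cycle, 0..1
        v = math.sin(2 * math.pi * t)                 # voltage
        i = math.sin(2 * math.pi * t - math.pi / 2)   # current, 90 degrees behind
        avg += v * i / N                              # accumulate p(t) = v(t) * i(t)

    print(f"Average power over a cycle: {avg:.6f}")   # ~0: energy flows out, then back
    ```

    Instantaneous power sloshes back and forth but averages to zero, just like the returned boxes.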
     
  9. May 7, 2017 #8

    rbelli1 (Gold Member)

    But the lobster dealer must purchase and maintain the boxes. If your lobsters require particularly robust (thus expensive) boxes then you are generating less profit for the dealer.

    BoB
     
  10. May 7, 2017 #9
    Large-scale industries operate their own substations with capacitor banks to correct the power factor.
     
  11. May 7, 2017 #10

    jim mcnamara (Staff: Mentor)

    @anorlunda - could you comment on some of the content here? Thanks. kVARs are commonly shown on large power bills with demand meters.

    FWIW, demand is an ongoing cost of business for distributors, and in New Mexico, PNM charges for kVARs, for example. High demand was a deal-breaker for the San Luis Valley Electric Coop. SLV has a lot of accounts for irrigation and chemigation (potatoes). If every account on a distribution line were to flip the switch on within a short window of time, demand costs for the coop would go through the roof. They worked out agreements among users so that there is no overlap in time of use; if demand ever were a problem, everybody on that line would get a huge financial kick in the butt. It works really well.

    If you ever go to Alamosa CO, go to one of the potato coop stores. Wonderful potatoes.
     
  12. May 7, 2017 #11

    anorlunda (Science Advisor, Gold Member)

    @jim mcnamara , There is merit in what you say.

    There are lots of reasons why VAR billing is unusual. @russ_watters gave the primary one: MWh is more directly tied to costs.
    But I think the other big reason is purely silly.

    If we were to charge the public for imaginary power, we would become objects of ridicule for every state legislator, every public service commission lawyer, and every pundit, late-night comic, and loudmouth on TV. We would be given Golden Fleece awards. Imagine the chaos in the billing department if one in every 1000 ratepayers sent us a check for imaginary money to pay for his imaginary power. Ha ha LOL.

    With "large bills", we deal with engineers and plant managers, who won't treat it like a big joke. We can negotiate VARs as part of the interconnection agreement. The politicians and TV comics don't read or understand these contracts, so it doesn't get silly.

    But with my engineer's hat on, and silliness aside: if I charge money for kVAR, then customers feel entitled to use what they want as long as they pay. I would rather put in a relay with a hair trigger to trip loads drawing excessive kVAR than send a big bill. In fact, to invent a different kind of silliness, it would be nice to shift the burden and expense of voltage control to customers. In the past, that could never happen, but in tomorrow's world, with aggregation of homeowner/community-owned distributed generation (I'm thinking of NY's REV), it could become part of the interconnection agreement.
     
  13. May 8, 2017 #12

    jim hardy (Science Advisor, Gold Member, 2016 Award)

    My opinion? It's just tradition.

    Utilities grew out of the Depression, when "square dealing" was the norm.
    Florida Power and Light started as an ice company selling a little of its internally generated electricity to folks who lived close to the ice plants.

    Most of a tiny 1930s-era utility's revenue stemmed from converting energy from coal, oil, or even sawdust into kilowatt-hours, and that's what they sold. It would have been deemed unconscionable to charge one's neighbors for something that cost nothing to produce.

    I had the good fortune to know some of those "old timers". Think "old-fashioned value system"...

    old jim
     
  14. May 8, 2017 #13
    In general, residential customers can only be billed for power, not VARs, by law. Large industrial customers can and do get billed a VAR surcharge, either for VAR demand or for kVARh, or both.

    If you had very high VAR demand at your home, it would not help you in any way; any useful purpose would result in real power, watts. If, for example, you had a large inductor on the line, you would just be creating and collapsing a magnetic field; as soon as you tried to use that field, you would be consuming real power. Also, the heat losses from the current in your conductors are billed to you, since this too is REAL power. So there is some motivation to use more efficient appliances.
     
  15. May 8, 2017 #14
    Can we put an estimate on those increased system losses?

    I recall discussing with people how not all of the residential $ savings (based on watts) from a CFL translates into lower fossil-fuel usage at the power plant, because of the power factor.

    I've heard an estimate of ~8% average losses in the North American grid. I'm not sure how much of that is I²R loss; I guess some is transformer conversion loss, but I don't know how much. But if a CFL has a PF of 0.5, it will draw 2x the current for the same real power, and losses go as I²R, so 4 times the loss (of some of that 8%). But I also wonder about the generation side. I don't know the efficiency of large alternators, but I'd assume mid-to-high 90s percent? How much of that loss comes from current? Those losses might be scaled up as well.
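    A quick sanity check on that scaling, assuming the same real power delivered at different power factors (illustrative, ignoring voltage drop):

    ```python
    # Relative line current and I^2*R loss vs. power factor, same real power delivered
    for pf in (1.0, 0.9, 0.707, 0.5):
        current_ratio = 1.0 / pf        # I = P / (V * pf), so I scales as 1/pf
        loss_ratio = current_ratio**2   # I^2 * R
        print(f"pf={pf:.3f}: current x{current_ratio:.2f}, line loss x{loss_ratio:.2f}")
    ```

    So a PF of 0.707 doubles the I²R loss, and a PF of 0.5 quadruples it, relative to unity power factor.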

    I'd also heard the argument that CFLs present a capacitive PF while our homes are full of inductive loads (motors), so the CFLs were actually helping to balance the grid's power factor. But I reasoned that if that were true, there would be no motivation to design CFLs with a PF closer to one, and I recall a push for this. The same applies to LEDs; I assume their switching circuits look capacitive as well, since they store energy between full-wave-rectified peaks to reduce flicker.
     
  16. May 8, 2017 #15

    russ_watters (Staff: Mentor)

    Maybe when I can put some more thought into it, but I do have a short answer:
    I've never heard of anyone considering the impact on grid power loss when doing the changeover from incandescent to CFL. So before you add back in the loss due to the reactive current, you first have to subtract the savings due to the lower active current!

    Suffice it to say, once you include both, you'll find the net savings has gone up, not down.
    No, they are inductive.
     
  17. May 8, 2017 #16
    Sure, they save overall; I was just saying they don't save as much as the residential $ savings would indicate, due to these losses.
     
  18. May 8, 2017 #17

    russ_watters (Staff: Mentor)

    Let me try again: if a CFL uses 20% as much active power as the incandescent it replaced, the savings at the power plant are greater, not less, once the power-line loss is included.
     
  19. May 8, 2017 #18
    Of course.

    My point is that if someone buys a CFL/LED that uses 20% of the watts of a comparable incandescent (but has a PF of ~0.5), they may assume the fossil-fuel plant uses 20% of the fuel to power that CFL/LED versus a filament bulb. I'm trying to estimate the additional losses, to say that maybe it really takes 25%, 30%, or whatever. Sure, it still makes good sense to make the change, but just how much does it save, from a supply viewpoint?
     
  20. May 8, 2017 #19

    russ_watters (Staff: Mentor)

    Still not getting it. If the CFL draws 20% (btw, bad estimate on my part; it is more like 25%), the power plant sees it drawing less than 20% as much, not more than 20% as much. That's because:
    1. The line loss is on top: the 100 W incandescent draws 108 W from the power plant.
    2. The loss is a square function of current, not a linear proportion.
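    Putting rough numbers on those two points (my figures are illustrative: an 8% line loss at the incandescent's current, a 25 W CFL, and a few assumed power factors):

    ```python
    # Rough changeover arithmetic; all numbers illustrative
    LOSS_AT_INC = 8.0   # assumed line loss (W) when carrying the incandescent's current
    P_inc = 100.0       # incandescent real power, PF ~ 1.0
    plant_inc = P_inc + LOSS_AT_INC   # 108 W supplied by the plant

    P_cfl = 25.0
    for pf in (1.0, 0.6, 0.5):
        current_ratio = (P_cfl / P_inc) / pf        # relative line current
        loss_cfl = LOSS_AT_INC * current_ratio**2   # square-law line loss
        plant_cfl = P_cfl + loss_cfl
        print(f"CFL @ pf={pf}: plant supplies {plant_cfl:.1f} W "
              f"({plant_cfl / plant_inc:.1%} of the incandescent's 108 W)")
    ```

    Even at an assumed PF of 0.5, the plant-side share only just reaches the 25% nameplate ratio; at better power factors it falls below it.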
     
  21. May 8, 2017 #20

    anorlunda (Science Advisor, Gold Member)

    Spot on; you hit the nail on the head, Russ. And it goes double when switching to LEDs. Some utility engineers wrung their hands about harmonics because LED controllers are non-linear, but when it was pointed out how much they reduce the real power, the harmonics worries disappeared.

    @NTL2009 you're trying too hard. The kinds of losses you're talking about are small potatoes; too small to worry about.
     