Granular load factor statistics for the Palo Verde plant


Discussion Overview

The discussion revolves around granular load factor (LF) statistics for the Palo Verde nuclear plant, focusing on the implications of LF values exceeding 100%, the definitions of LF, and the potential energy waste associated with LF values below 100%. Participants explore the technical aspects of LF reporting practices, safety considerations, and the impact of operational changes on load factors.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested

Main Points Raised

  • Some participants express confusion about the meaning of an LF greater than 100% and whether it indicates that the reactor is operating above its nameplate rating.
  • Others clarify that LF is defined as the ratio of actual energy produced to the energy that would have been produced at reference capacity, suggesting that favorable ambient conditions can lead to LF values exceeding 100% without exceeding core power limits.
  • A participant notes that modifications to licenses may allow for increased power output while maintaining historical data integrity, which could explain LF values above 100%.
  • Concerns are raised about the definition of "waste" when LF is below 100%, with some arguing that not all energy not produced should be considered waste.
  • Some participants discuss the implications of uprating, suggesting that improvements in analysis tools and reduced uncertainty in safety margins can lead to higher maximum operating points.
  • There is mention of regulatory oversight by the NRC, emphasizing that exceeding rated thermal power (RTP) is not permitted and could result in penalties.
  • One participant questions the safety implications of operating at LF values above nameplate ratings and seeks clarity on how high LF can go while ensuring reactor safety.

Areas of Agreement / Disagreement

Participants do not reach a consensus on the interpretation of LF values above 100%, the definition of energy waste, or the implications of uprating. Multiple competing views remain regarding the safety and operational practices of the Palo Verde plant.

Contextual Notes

Participants highlight the complexity of defining load factor and energy waste, as well as the influence of operational practices and regulatory frameworks on reported statistics. The discussion reflects varying assumptions about the definitions and implications of LF in the context of nuclear power generation.

bgm
Hello all, first post. I'm interested in granular load factor statistics for the Palo Verde plant, and LF reporting practices in general. Reviewing the LF stats for PV in the PRIS DB, I'm a bit confused. The annual LF in 2002 was 102, and PRIS indicates that 102 is a percentage. What I would really like to know is whether an LF can be above the nameplate rating, and how high LF can go while keeping the reactor(s) safe? How much energy are we wasting when the LF < 100?

Context - my profession is M, E, & systems eng. I am interested in load shifting, and I would really like to figure out how much energy is being wasted from our base infrastructure. Loading PV to her limit seems like low-hanging fruit, and a great place to start. I'd really like to know the MWe waste out of PV.

Cheers,

Ben
 
Please give us a link to your source. It is likely that the source provides the definition of LF that they use.

By the way, I think you mean Palo Verde by PV, but the more common meaning of PV is photovoltaic, or solar power.
bgm said:
I'd really like to know the MWe waste out of PV.

Do you mean MWt (thermal) rather than MWe? We usually think that every last MWe produced (minus transmission losses) is consumed, and therefore not wasted. You need to provide your own definition of "waste."
 
anorlunda said:
Please give us a link to your source. It is likely that the source provides the definition of LF that they use.
For Palo Verde 1, 2, 3
https://pris.iaea.org/PRIS/CountryStatistics/ReactorDetails.aspx?current=789
https://pris.iaea.org/PRIS/CountryStatistics/ReactorDetails.aspx?current=790
https://pris.iaea.org/PRIS/CountryStatistics/ReactorDetails.aspx?current=791

Load Factor (LF), also called Capacity Factor, for a given period, is the ratio of the energy which the power reactor unit has produced over that period divided by the energy it would have produced at its reference power capacity over that period. Reference energy generation (net) is the energy that...

Unit 3 achieved LF = 102 in 2002. In the same year Unit 1 LF = 89.1, and Unit 2 LF = 92.0

Unit 1 had LF = 100.4 in 2000, 100.9 in 2009, and 101.0 in 2015.
Unit 2 had LF = 101.8 in 1998, 101.2 in 2010, and 101.3 in 2016.
Unit 3 had LF = 100.3 in 1999, 102.0 in 2002, and 100.8 in 2015.

These are probably years in which they did not have refueling or maintenance outages, or periods of reduced power.
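The PRIS definition quoted above reduces to a simple ratio, which can be sketched in a few lines of Python. The capacity and outage numbers below are made up for illustration; they are not actual Palo Verde figures.

```python
# Load factor (capacity factor) per the PRIS definition:
#   LF = actual net generation / (reference capacity * hours in period)
# Numbers below are illustrative only, not actual Palo Verde data.

def load_factor(actual_mwh, reference_mwe, hours):
    """Return the load factor as a percentage."""
    return 100.0 * actual_mwh / (reference_mwe * hours)

HOURS_PER_YEAR = 8760
REF_MWE = 1300  # assumed reference (net) capacity for the sketch

# A year with no outage and slightly better-than-reference output,
# e.g., cold weather lowering the condenser pressure:
print(load_factor(1315 * HOURS_PER_YEAR, REF_MWE, HOURS_PER_YEAR))  # ~101.2

# A year with a 35-day refueling outage, reference output otherwise:
print(load_factor(REF_MWE * (HOURS_PER_YEAR - 35 * 24),
                  REF_MWE, HOURS_PER_YEAR))  # ~90.4
```

This makes the pattern in the PRIS tables plausible: a modest surplus over reference output pushes LF just above 100, while a single refueling outage pulls an otherwise perfect year down to around 90.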
 
The most likely explanation is that the units in question have had their licenses modified to allow more power than their original license permitted. When that happens, in order not to mess up the historical data, the "rated power" is kept at the original value, so the LF can be >100%. However, if the rated power had been changed, then by definition 100% would be the maximum LF.

How much is the max? That is not a simple number. The license applications make a case to NRC for a maximum power considering all applicable safety considerations. There are numerous considerations, and simulations are required to assure that the safety criteria are met. If the application is approved, that becomes the new max.

bgm said:
How much energy are we wasting when the LF<100?
I would not call that waste. If your car can drive 120 mph max, but you drive it at a slower speed, is that waste?

But if you insist on calling that waste, then the answer is so trivial I don't understand why you asked. If the LF is 51%, then 49% is "wasted". Remember that power varies through the year, so all these values are yearly averages.

Reactors must be refueled sometime. In years when the plant is shut down for refueling, 100% LF cannot be reached. But it's tricky, because some years pass without a refueling shutdown. Is the refueling shutdown waste? Do you consider it waste when your average speed is reduced because you had to stop for gas?

In years past, nuclear power plants were considered "base load" because they were the least expensive option. That means they made as much power as they could, even if that was less than 100%.

In recent years however, competing forms of power generation have become less expensive than nuclear at certain times. When that happens, the grid operator is obligated to buy power from the least expensive source minute by minute (subject to reliability constraints) and that is not always nuclear.
 
Likes: russ_watters
anorlunda said:
The most likely explanation is that the units in question have had their licenses modified to allow more power than their original license permitted. When that happens, in order not to mess up the historical data, the "rated power" is kept at the original value, so the LF can be >100%. However, if the rated power had been changed, then by definition 100% would be the maximum LF.
...I agree this is possible, but maybe that first sentence could be put in a less potentially shocking way:

Nuclear plants have poor thermal efficiency and every step in getting energy out incurs losses, so it is possible that the increase in output power actually comes from a decrease in losses due to upgraded or even just originally conservatively rated ancillary equipment.

E.g., install more efficient pumps or transformers and the plant's output power goes up.

I think it is unlikely that the plant is splitting more atoms than originally intended.
 
russ_watters said:
I think it is unlikely that the plant is splitting more atoms than originally intended.

Almost all nukes in the Americas and some in Europe have been uprated since commissioning. I think 5% is a typical number. The primary reason is improved analysis tools. Better analysis can calculate safety margins with greater precision. In short, sharper pencils.

An application not only has to include calculations, but also uncertainty estimates. To maintain a required safety margin, you add the uncertainty to the calculated extreme. Reduced uncertainty allows greater max/min operating points. Validation and verification of the models helps provide reduced uncertainty.
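The margin arithmetic described here can be illustrated with made-up numbers; the safety limit and uncertainty values below are assumptions for the sketch, not licensing figures for any real plant.

```python
# Sketch of how reduced calculational uncertainty permits a higher
# licensed operating point. Numbers are invented for illustration.

def max_operating_power(safety_limit_mwt, uncertainty_fraction):
    """Highest power whose uncertainty-adjusted extreme stays at the limit:
    power * (1 + uncertainty) <= safety_limit."""
    return safety_limit_mwt / (1.0 + uncertainty_fraction)

LIMIT = 4000.0  # assumed analyzed safety limit, MWt

old = max_operating_power(LIMIT, 0.02)   # 2% analysis/measurement uncertainty
new = max_operating_power(LIMIT, 0.005)  # sharper pencils: 0.5% uncertainty

print(old, new, 100 * (new - old) / old)  # uprate of roughly 1.5%
```

So shrinking the uncertainty term alone, with the underlying safety limit untouched, buys an uprate on the order of a percent or two, which is consistent with the typical figures mentioned above.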

See this article about uprating.
http://westinghousenuclear.com/Portals/0/About/Innovation/The Impact of FLEX on Outage Risk.pdf
 
The load factor is based on the electrical output (MWe) vs. its value "at reference ambient conditions." So if the ambient conditions are favorable (low temperatures, for instance, lower the condenser pressure), then the plant electrical output increases with no change in the core power. That's why we sometimes see load factors greater than 100 percent.

The plant license establishes the maximum core power (RTP, "rated thermal power"). The safety analyses are based on the RTP (with a multiplier for measurement uncertainty). The plants are not intentionally operated with the core producing greater than RTP. If the core power is found to exceed RTP, the NRC is swift with penalties and fines.
 
gmax137 said:
The plant license establishes the maximum core power (RTP, "rated thermal power"). The safety analyses are based on the RTP (with a multiplier for measurement uncertainty). The plants are not intentionally operated with the core producing greater than RTP. If the core power is found to exceed RTP, the NRC is swift with penalties and fines.

That's true, but increasing that RTP is what uprating is all about. As I said in #6, improvements in calculational uncertainty contribute to decreasing that multiplier.
 
Hmm, I am trying to address the OP's safety concern:
bgm said:
What I would really like to know is if a LF can be above nameplate rating, and how high LF can go while keeping the reactor(s) safe?
You suggested that the higher than 100% load factor was due to uprated power without increasing the reference power:
anorlunda said:
The most likely explanation is that the units in question have had their licenses modified to allow more power than their original license permitted. When that happens, in order not to mess up the historical data, the "rated power" is kept at the original value, so the LF can be >100%. However, if the rated power had been changed, then by definition 100% would be the maximum LF.
This is incorrect, as can be seen by following the link provided by astronuc:
The PRIS data shows both the Load Factor and the reference power (MWe) year-by-year; the reference power value increased in 1997 (coincident with the first 2% uprate) and then again in 2006 (coincident with the subsequent 2.9% uprate). So the instances of greater than 100% load factor are not due to the uprating.

The answer to the safety concern is that the plant is never operated above the licensed core power. The instances of Load Factor exceeding 100% are due to the secondary side (turbine generator) performing well, that is, achieving thermal efficiency better than its "reference" value. More electrical output MWe for the same core output MWth.

I am not trying to be argumentative here, I just want to make sure the OP is not left with the idea that the nuclear plants are operated at unsafe power levels.
 
Likes: russ_watters
#10
gmax137 said:
The PRIS data shows both the Load Factor and the reference power (MWe) year-by-year; the reference power value increased in 1997 (coincident with the first 2% uprate) and then again in 2006 (coincident with the subsequent 2.9% uprate). So the instances of greater than 100% load factor are not due to the uprating.

The answer to the safety concern is that the plant is never operated above the licensed core power. The instances of Load Factor exceeding 100% are due to the secondary side (turbine generator) performing well, that is, achieving thermal efficiency better than its "reference" value. More electrical output MWe for the same core output MWth.

OK, I bow to evidence. I was wrong.
 
#11
gmax137 said:
The PRIS data shows both the Load Factor and the reference power (MWe) year-by-year; the reference power value increased in 1997 (coincident with the first 2% uprate) and then again in 2006 (coincident with the subsequent 2.9% uprate). So the instances of greater than 100% load factor are not due to the uprating.
Yes, that is the point I was making. Palo Verde units received approval for the 2% uprates 05/23/96. Palo Verde 2 received approval for 2.9% uprate on 09/29/03, and Palo Verde 1 and 3 received approvals for 2.9% uprates on 11/16/05.
Ref: https://www.nrc.gov/reactors/operat.../status-power-apps/approved-applications.html

By definition, Load Factor (LF), also called Capacity Factor, for a given period is the ratio of the energy which the power reactor unit has produced over that period divided by the energy it would have produced at its reference power capacity over that period. So for the given years, the units produced on average more than their reference generation capacity. That can happen if the condenser can reach a lower temperature, e.g., due to exceptionally cold weather. The LF > 100% was not necessarily due to a thermal power increase.

Even without thermal power uprates, plants can realize additional efficiency in their turbines, with better blade designs and improved seals. Siemens introduced an enhanced turbine design back in the 1990s, and several German plants realized significant increases in power conversion efficiency, so they could produce an additional ~50 MWe or so without increasing thermal power. Siemens did a number of turbine upgrades during the 1990s and early 2000s.

Palo Verde has been limited with respect to uprating due to corrosion of steam generators and fuel. They had some issues many years ago.
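The mechanism described in this thread, more MWe from the same MWth, can be sketched directly. The efficiency values below are rough assumptions typical of a PWR secondary side, not actual plant data.

```python
# Same core thermal power, better secondary-side efficiency -> LF > 100%.
# Efficiency values are rough PWR-typical assumptions, not plant data.

CORE_MWT = 3990.0          # fixed rated thermal power (never exceeded)
ETA_REFERENCE = 0.330      # assumed efficiency at reference ambient conditions
ETA_COLD_WEATHER = 0.336   # slightly better with a colder condenser

ref_mwe = CORE_MWT * ETA_REFERENCE        # reference electrical output
actual_mwe = CORE_MWT * ETA_COLD_WEATHER  # actual electrical output

lf = 100.0 * actual_mwe / ref_mwe
print(f"LF = {lf:.1f}%")   # about 101.8%, with core power unchanged
```

Note that the thermal power term cancels out of the ratio: the LF above 100% comes entirely from the efficiency improvement on the secondary side, which is the safety point made in the posts above.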
 
Likes: russ_watters and anorlunda
#12
A plant is given a guaranteed MW-electric output rating when it's commissioned, or when major maintenance or upgrades alter this rating. The reactor is only rated during licensing activities, and that is an MW-thermal rating.

If your plant is running more efficiently than expected, you can exceed 100% rated capacity factor.

I’ve also seen committed capacity factor, where a plant advertises less total yearly output, but exceeds it because they had a shorter maintenance or outage window or something.

This does not mean the reactor is exceeding 100% RTP. That's illegal for any period of time. In fact, all plants operate slightly below 100% to account for uncertainties in the heat balance equation.