Granular load factor statistics for the Palo Verde plant

In summary, Unit 1 achieved an LF of 100.4 in 2000, 100.9 in 2009, and 101.0 in 2015. Unit 2 achieved an LF of 101.8 in 1998, 101.2 in 2010, and 101.3 in 2016. Unit 3 achieved an LF of 100.3 in 1999, 102.0 in 2002, and 100.8 in 2015.
  • #1
bgm
Hello all, first post. I'm interested in granular load factor statistics for the Palo Verde plant, and LF reporting practices in general. Reviewing the LF stats for PV in the PRIS DB, I'm a bit confused. The annual LF in 2002 was 102. PRIS indicates that 102 is a percentage. What I would really like to know is if a LF can be above nameplate rating, and how high LF can go while keeping the reactor(s) safe? How much energy are we wasting when the LF<100?

Context - my profession is M, E, & systems eng. I am interested in load shifting, and I would really like to figure out how much energy is being wasted from our base infrastructure. Loading PV to her limit seems like low-hanging fruit, and a great place to start. I'd really like to know the MWe waste out of PV.

Cheers,

Ben
 
  • #2
Please give us a link to your source. It is likely that the source provides the definition of LF that they use.

By the way, I think you mean Palo Verde by PV, but the more common meaning of PV is photovoltaic, or solar power.
bgm said:
I'd really like to know the MWe waste out of PV.

Do you mean MWt (thermal) rather than MWe? We usually think that every last MWe produced (minus transmission losses) is consumed, and therefore not wasted. You need to provide your own definition of "waste."
 
  • #3
anorlunda said:
Please give us a link to your source. It is likely that the source provides the definition of LF that they use.
For Palo Verde 1, 2, 3
https://pris.iaea.org/PRIS/CountryStatistics/ReactorDetails.aspx?current=789
https://pris.iaea.org/PRIS/CountryStatistics/ReactorDetails.aspx?current=790
https://pris.iaea.org/PRIS/CountryStatistics/ReactorDetails.aspx?current=791

Load Factor (LF), also called Capacity Factor, for a given period, is the ratio of the energy which the power reactor unit has produced over that period divided by the energy it would have produced at its reference power capacity over that period. Reference energy generation (net) is the energy that...

Unit 3 achieved LF = 102 in 2002. In the same year Unit 1 LF = 89.1, and Unit 2 LF = 92.0

Unit 1 had LF = 100.4 in 2000, 100.9 in 2009, and 101.0 in 2015.
Unit 2 had LF = 101.8 in 1998, 101.2 in 2010, and 101.3 in 2016.
Unit 3 had LF = 100.3 in 1999, 102.0 in 2002, and 100.8 in 2015.

These are probably years in which they did not have refueling or maintenance outages, or periods of reduced power.
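
To make the definition concrete, here is a minimal Python sketch of the PRIS load-factor formula. The capacity and generation numbers below are made up for illustration; they are not actual Palo Verde figures.

```python
# Illustrative load-factor calculation following the PRIS definition:
# actual net energy divided by the energy the unit would have produced
# at its reference capacity over the same period.

HOURS_PER_YEAR = 8760  # ignoring leap years for simplicity

def load_factor(net_generation_mwh: float, reference_capacity_mwe: float,
                hours: float = HOURS_PER_YEAR) -> float:
    """Return the load (capacity) factor in percent."""
    reference_energy_mwh = reference_capacity_mwe * hours
    return 100.0 * net_generation_mwh / reference_energy_mwh

# A hypothetical unit with a 1,311 MWe reference capacity that nets
# 11.6 TWh in a calendar year comes out slightly above 100%:
lf = load_factor(11_600_000, 1311)
print(f"LF = {lf:.1f}%")
```

Nothing in the formula caps the result at 100: if actual net generation exceeds the reference energy, the LF exceeds 100%.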
 
  • #4
The most likely explanation is that the units in question have had their licences modified to allow more power than their original licence. When that happens, in order not to mess up the historical data, the "rated power" is kept at the original value, so the LF can be >100%. However, if the rated power had been changed, then by definition 100% would be the maximum LF.

How much is the max? That is not a simple number. The license applications make a case to NRC for a maximum power considering all applicable safety considerations. There are numerous considerations, and simulations are required to assure that the safety criteria are met. If the application is approved, that becomes the new max.

bgm said:
How much energy are we wasting when the LF<100?
I would not call that waste. If your car can drive 120 mph max, but you drive it at a slower speed, is that waste?

But if you insist on calling that waste, then the answer is so trivial I don't understand why you asked. If the LF is 51%, then 49% is "wasted". Remember that power varies through the year, so all these values are yearly averages.
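
If you do adopt that definition of waste, the arithmetic is just the shortfall against reference energy. A quick sketch, with an illustrative reference capacity rather than a real unit's:

```python
# "Waste" under the LF-shortfall definition: the energy not produced
# relative to running at reference capacity all year.
# The capacity value is illustrative, not a real unit's rating.

reference_mwe = 1311.0
hours = 8760
lf_percent = 51.0  # the example figure from the post above

shortfall_mwh = reference_mwe * hours * (100.0 - lf_percent) / 100.0
print(f"'wasted' energy = {shortfall_mwh / 1e6:.2f} TWh")
```

For a real unit you would substitute its reference capacity and the annual LF reported by PRIS.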

Reactors must be refueled sometime. In years when the plant is shut down for refueling, 100% LF cannot be reached. But it's tricky, because some years pass without a refueling shutdown. Is the refueling shutdown waste? Do you consider it waste when your average speed is reduced because you had to stop for gas?

In years past, nuclear power plants were considered "base load" because they were the least expensive option. That means that they made as much power as they could, even if that was less than 100%.

In recent years however, competing forms of power generation have become less expensive than nuclear at certain times. When that happens, the grid operator is obligated to buy power from the least expensive source minute by minute (subject to reliability constraints) and that is not always nuclear.
 
  • #5
anorlunda said:
The most likely explanation is that the units in question have had their licences modified to allow more power than their original license. When that happens, in order to not mess up the historical data, the "rated power" is kept at the original value, so that the LF be >100%. However, if we did change rated power, then by definition 100% is the maximum LF.
...I agree this is possible, but maybe that first sentence could be put in a less potentially shocking way:

Nuclear plants have poor thermal efficiency and every step in getting energy out incurs losses, so it is possible that the increase in output power actually comes from a decrease in losses due to upgraded or even just originally conservatively rated ancillary equipment.

E.g., install more efficient pumps or transformers and the plant's output power goes up.

I think it is unlikely that the plant is splitting more atoms than originally intended.
 
  • #6
russ_watters said:
I think it is unlikely that the plant is splitting more atoms than originally intended.

Almost all nukes in the Americas and some in Europe have been uprated since commissioning. I think 5% is a typical number. The primary reason is improved analysis tools. Better analysis can calculate safety margins with greater precision. In short, sharper pencils.

An application not only has to include calculations, but also uncertainty estimates. To maintain a required safety margin, you add the uncertainty to the calculated extreme. Reduced uncertainty allows greater max/min operating points. Validation and verification of the models helps provide reduced uncertainty.

See this article about uprating.
http://westinghousenuclear.com/Portals/0/About/Innovation/The Impact of FLEX on Outage Risk.pdf
 
  • #7
The load factor is based on the electrical output (MWe) vs. its value "at reference ambient conditions." So if the ambient conditions are favorable (low temperatures, for instance, lower the condenser pressure), then the plant electrical output increases with no change in the core power. That's why we sometimes see load factors greater than 100 percent.

The plant license establishes the maximum core power (RTP, "rated thermal power"). The safety analyses are based on the RTP (with a multiplier for measurement uncertainty). The plants are not intentionally operated with the core producing greater than RTP. If the core power is found to exceed RTP, the NRC is swift with penalties and fines.
 
  • #8
gmax137 said:
The plant license establishes the maximum core power (RTP, "rated thermal power"). The safety analyses are based on the RTP (with a multiplier for measurement uncertainty). The plants are not intentionally operated with the core producing greater than RTP. If the core power is found to exceed RTP, the NRC is swift with penalties and fines.

That's true, but increasing that RTP is what uprating is all about. As I said in #6, reduced calculation uncertainty contributes to decreasing that multiplier.
 
  • #9
Hmm, I am trying to address the OP's safety concern:
bgm said:
What I would really like to know is if a LF can be above nameplate rating, and how high LF can go while keeping the reactor(s) safe?
You suggested that the higher than 100% load factor was due to uprated power without increasing the reference power:
anorlunda said:
The most likely explanation is that the units in question have had their licences modified to allow more power than their original license. When that happens, in order to not mess up the historical data, the "rated power" is kept at the original value, so that the LF be >100%. However, if we did change rated power, then by definition 100% is the maximum LF.
This is incorrect, as can be seen by following the links to the PRIS data provided by Astronuc:
The PRIS data shows both the Load Factor and the reference power (MWe) year-by-year; the reference power value increased in 1997 (coincident with the first 2% uprate) and then again in 2006 (coincident with the subsequent 2.9% uprate). So the instances of greater than 100% load factor are not due to the uprating.

The answer to the safety concern is that the plant is never operated above the licensed core power. The instances of Load Factor exceeding 100% are due to the secondary side (turbine generator) performing well, that is, achieving thermal efficiency better than its "reference" value. More electrical output MWe for the same core output MWth.
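
A toy calculation of that mechanism, with made-up numbers (not Palo Verde's actual ratings), shows how LF can exceed 100% while core power never exceeds RTP:

```python
# Sketch: LF > 100% from secondary-side performance, core power unchanged.
# All ratings below are illustrative, not actual Palo Verde values.

rtp_mwth = 3990.0                          # licensed rated thermal power
reference_mwe = 1311.0                     # reference net electrical capacity
reference_eta = reference_mwe / rtp_mwth   # implied reference net efficiency

# Cold cooling water lowers condenser pressure, so the turbine cycle
# converts heat to electricity a bit better than the reference value,
# even though the core stays at (or below) RTP:
actual_eta = reference_eta * 1.012         # ~1.2% relative efficiency gain
actual_mwe = rtp_mwth * actual_eta

lf = 100.0 * actual_mwe / reference_mwe    # LF just above 100%
print(f"net output {actual_mwe:.0f} MWe, LF = {lf:.1f}%")
```

The core term (rtp_mwth) never changes; only the heat-to-electricity conversion does, which is the point being made above.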

I am not trying to be argumentative here, I just want to make sure the OP is not left with the idea that the nuclear plants are operated at unsafe power levels.
 
  • #10
gmax137 said:
The PRIS data shows both the Load Factor and the reference power (MWe) year-by-year; the reference power value increased in 1997 (coincident with the first 2% uprate) and then again in 2006 (coincident with the subsequent 2.9% uprate). So the instances of greater than 100% load factor are not due to the uprating.

The answer to the safety concern is that the plant is never operated above the licensed core power. The instances of Load Factor exceeding 100% are due to the secondary side (turbine generator) performing well, that is, achieving thermal efficiency better than its "reference" value. More electrical output MWe for the same core output MWth.

OK, I bow to evidence. I was wrong.
 
  • #11
gmax137 said:
The PRIS data shows both the Load Factor and the reference power (MWe) year-by-year; the reference power value increased in 1997 (coincident with the first 2% uprate) and then again in 2006 (coincident with the subsequent 2.9% uprate). So the instances of greater than 100% load factor are not due to the uprating.
Yes, that is the point I was making. Palo Verde units received approval for the 2% uprates 05/23/96. Palo Verde 2 received approval for 2.9% uprate on 09/29/03, and Palo Verde 1 and 3 received approvals for 2.9% uprates on 11/16/05.
Ref: https://www.nrc.gov/reactors/operat.../status-power-apps/approved-applications.html

Load Factor (LF), also called Capacity Factor, for a given period is the ratio of the energy the unit produced over that period to the energy it would have produced at its reference power capacity. By that definition, LF > 100% in a given year means the unit produced, on average, more than its reference generation capacity. That can happen if the condenser can reach a lower temperature, e.g., due to exceptionally cold weather. The LF > 100% was not necessarily due to a thermal power increase.

Even without thermal power uprates, plants can realize additional efficiency in their turbines, with better blade designs and improved seals. Siemens introduced an enhanced turbine design back in the 1990s, and several German plants realized significant increases in power conversion efficiency, so they could produce an additional ~50 MWe without increasing thermal power. Siemens did a number of turbine upgrades during the 1990s and early 2000s.
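
Rough arithmetic for that turbine-upgrade point, with illustrative efficiency figures (not any specific plant's data): at fixed thermal power, a modest gain in conversion efficiency is worth tens of MWe.

```python
# Extra electrical output from a turbine efficiency gain at fixed
# thermal power. The efficiencies below are illustrative round numbers.

mwth = 3850.0        # core thermal power, unchanged by the upgrade
eta_old = 0.335      # net efficiency with the original turbine
eta_new = 0.348      # net efficiency after blade/seal upgrades

gain_mwe = mwth * (eta_new - eta_old)
print(f"extra output ~ {gain_mwe:.0f} MWe at the same thermal power")
```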

Palo Verde has been limited with respect to uprating due to corrosion of steam generators and fuel. They had some issues many years ago.
 
  • #12
A plant is given a guaranteed MW electric output rating when it's commissioned, or when major maintenance or upgrades alter this rating. The reactor itself is rated only during licensing activities, and that rating is in MW thermal.

If your plant is running more efficiently than expected, you can exceed 100% rated capacity factor.

I’ve also seen committed capacity factor, where a plant advertises less total yearly output, but exceeds it because they had a shorter maintenance or outage window or something.

This does not mean the reactor is exceeding 100% RTP. That's illegal for any period of time. In fact, all plants operate slightly below 100% to account for uncertainties in the heat balance calculation.
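
A quick sketch of that margin arithmetic. The 2% uncertainty figure is an assumed round number for illustration, not any plant's actual heat-balance uncertainty:

```python
# Operating slightly below 100% RTP so that heat-balance measurement
# uncertainty cannot push actual core power over the license limit.
# Both numbers below are illustrative assumptions.

rtp_mwth = 3990.0      # licensed rated thermal power (illustrative)
uncertainty = 0.02     # assumed fractional heat-balance uncertainty

# Pick the highest target such that target * (1 + uncertainty) <= RTP:
target_mwth = rtp_mwth / (1.0 + uncertainty)
print(f"operating target ~ {target_mwth:.0f} MWth "
      f"({100.0 * target_mwth / rtp_mwth:.1f}% of RTP)")
```

With these assumed numbers the indicated operating point sits about 2% under RTP, so even a worst-case measurement error keeps actual power within the licensed limit.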
 

1. What is the purpose of granular load factor statistics for the Palo Verde plant?

The purpose of granular load factor statistics is to provide detailed information on the efficiency and utilization of the power plant's equipment and resources. This data helps track the plant's performance and identify areas for improvement.

2. How is the granular load factor calculated?

The granular load factor is calculated by dividing the plant's actual energy output over a specific period by the energy it would have produced at its reference capacity over that same period. Downtime, maintenance, and reduced-power operation all lower the actual output and therefore the load factor.

3. What factors can affect the granular load factor at the Palo Verde plant?

There are several factors that can affect the granular load factor at the Palo Verde plant. These include changes in demand for electricity, unplanned downtime due to equipment failures or maintenance, and operational changes such as plant upgrades or modifications.

4. How is the granular load factor data used at the Palo Verde plant?

The granular load factor data is used by plant managers and engineers to monitor and improve the plant's performance. It can also be used for planning and decision-making, such as determining the need for equipment upgrades or scheduling maintenance.

5. How does the granular load factor at the Palo Verde plant compare to other power plants?

The load factors at the Palo Verde plant are generally high compared to other power plants of similar size and type. The units routinely achieve annual load factors above 90%, reflecting long operating cycles and short refueling outages, and in some years above 100% of reference capacity, as discussed in the thread above.
