# Why does the frequency on a power grid increase when generation exceeds load?

In summary, the frequency of a power grid is dependent on the angular velocity of the generators, which is carefully controlled to match the standard frequency used for timing functions. If load exceeds generation, the voltage may drop, but the frequency remains controlled. Utilities often have a spinning reserve in frequency control mode to respond to frequency changes caused by imbalances in generation and load. In steady state, power produced is equal to load demand, but during transients, excess energy is absorbed in the rotating machine train and converted to electrical energy.
illini1022
I could rephrase the same question, why does the frequency decrease when load exceeds generation?

I know these are fundamental operating characteristics of a power system network but I'm not sure I can mathematically prove it. My best guess is that everything dissipates more power when the frequency increases, or less power when the frequency decreases. So if you have an imbalance of generation/load, the change in frequency effectively forces the load to dissipate more power or consume less power, right?

Which equation explicitly shows this? The AC instantaneous current formula?

I'm familiar with frequency control schemes, etc, I just want to make sure I understand the mathematical concepts behind it.

Thanks

The frequency of a power grid depends upon the angular velocity of the generators. When there is more load it becomes harder to turn the generators at the same frequency. However, the power line frequency is used for many types of timing functions so the frequency is very carefully controlled and does not vary substantially from the standard frequency.

If load were to exceed generation, the voltage would drop but the frequency wouldn't change appreciably.

skeptic2 said:
The frequency of a power grid depends upon the angular velocity of the generators. When there is more load it becomes harder to turn the generators at the same frequency. However, the power line frequency is used for many types of timing functions so the frequency is very carefully controlled and does not vary substantially from the standard frequency.

If load were to exceed generation, the voltage would drop but the frequency wouldn't change appreciably.
Are you certain about that? From my (limited) power systems experience I was under the impression that if load exceeds generation the frequency will indeed begin to drop. (Voltage will drop as well just due to the increased voltage drop from higher currents).

Doesn't a typical utility have a spinning reserve which operates in frequency control mode? It monitors the frequency and checks if the frequency is dropping which means generation needs to be increased. I remember testing this exact concept in my EE lab by putting a generator in frequency droop control mode and overloading the network...

Unless you actually maxed out all the generators in the country, load does not exceed generation; what does happen is that the speed set point of the generators changes with respect to load. You need to understand how droop works.

Jobrag said:
Unless you actually maxed out all the generators in the country, load does not excede generation but what does happen is the speed set point of the generators changes with respect to load. You need to understand how droop works.

I don't mean potential load exceeding potential generation.

I mean what happens if the instantaneous power produced on some pretend power grid was 100MW. The instantaneous demand at this time is also 100MW.

All of a sudden a large section of load is tripped offline and demand drops to 99MW. Does the frequency not increase? And then the spinning reserve responds to this increase in frequency by scaling down its power output?

Such imbalances are removed by requesting generators to operate in so-called frequency-response mode (also called frequency-control mode), altering their output continuously to keep the frequency near the required value.
The grid frequency is a system-wide indicator of overall power imbalance. For example, it will drop if there is too much demand, because the generators will start to slow down slightly. A generator in frequency-response mode will, under nominal conditions, run at reduced output in order to maintain a buffer of spare capacity. It will then continually alter its output on a second-to-second basis to match the needs of the grid, using droop speed control.

http://en.wikipedia.org/wiki/Dynamic_demand_(electric_power)

In steady operation the power a grid produces is the same as the load demand: if you are running a small grid with 90 MW of demand, then the amount generated must also be 90 MW. If you then shed a chunk of that load, you go into a transient situation. There is a delay between the governor sensing the loss of load and the fuel valve closing to compensate. For a brief time more fuel is going into the prime mover than is required; the excess is absorbed as rotating energy in the machine train, i.e. the generator spins faster and absorbs more mechanical energy. The governor will then over-compensate, and the excess mechanical energy will be converted to electrical energy.

Jobrag said:
In steady operation the power a grid produces is the same as the load demand: if you are running a small grid with 90 MW of demand, then the amount generated must also be 90 MW. If you then shed a chunk of that load, you go into a transient situation. There is a delay between the governor sensing the loss of load and the fuel valve closing to compensate. For a brief time more fuel is going into the prime mover than is required; the excess is absorbed as rotating energy in the machine train, i.e. the generator spins faster and absorbs more mechanical energy. The governor will then over-compensate, and the excess mechanical energy will be converted to electrical energy.

So during that brief time, the generator spins faster - effectively increasing the frequency, correct? And the mechanism by which the governor senses a loss of load is by monitoring that frequency?

Yes and Yes

So to rehash this one more time.

Pin=Pout is really what I want to understand. At steady state let's say Pin = 100MW and Pout = 100MW. Now 1MW of load is lost. Examining this transient period of time - Pin is still 100MW because your generator has not had time to react.

Pout is still 100MW because of the conservation of energy we just spoke of (Pin=Pout). So my question really was, where does this extra 1MW of power get dissipated during this transient period? Let's assume there is no energy storage or flywheels.

There are only a couple of places I can see where this extra energy could be dissipated: either A) in the mechanical generating turbines themselves, B) the overall connected load helps compensate, or C) some combination of A and B.

I was thinking - when the frequency increases, the inductive reactance of a connected load increases as well, right? So does the connected grid dissipate slightly more power at 61 Hz as opposed to 60 Hz? Or does all of the extra power get burned up as mechanical energy?
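The reactance idea above is easy to check with a quick calculation. Here is a minimal sketch, with made-up component values, of the real power drawn by a fixed series R-L load at 60 Hz versus 61 Hz under a constant source voltage:

```python
# Real power drawn by a fixed series R-L load at 60 Hz vs 61 Hz,
# constant rms source voltage. Component values are illustrative.
import math

V = 120.0   # rms volts
R = 10.0    # ohms
L = 0.05    # henries

def real_power(f):
    X = 2 * math.pi * f * L   # inductive reactance rises with frequency
    Z = math.hypot(R, X)      # magnitude of series R-L impedance
    I = V / Z                 # rms current
    return I**2 * R           # power dissipated in the resistance

p60 = real_power(60.0)
p61 = real_power(61.0)
print(f"{p60:.1f} W at 60 Hz, {p61:.1f} W at 61 Hz")
```

Note the sign of the effect: higher reactance means *less* current, so a fixed R-L load actually dissipates slightly less power at 61 Hz, not more. The bigger self-correcting effect comes from motor loads, whose power draw rises with speed, as mentioned later in the thread.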

A while back, I was talking to the operators at the Mt. Elbert 200-MW pumped-storage hydropower facility in Colorado, and they mentioned to me that during the 1987 earthquake in the Los Angeles (California) area, a significant portion of that city dropped off the interstate electrical grid, causing a transient in the spinning generators at Mt. Elbert, which were being driven by the water flow in the penstocks. The generators absorbed the power transient without tripping, but the synchronous phase of the generator armatures did change momentarily because of the sudden loss of load. At first, the operators didn't know if L.A. had just dropped off the grid or fallen into the ocean.

Bob S

Let's assume there is no energy storage or flywheels.

Actually the generators have a lot of rotating inertia and a significant part of the load is rotating machinery - refrigeration motors and the like.

From the control room of a power plant here's what we see -

In the utility's dispatch office is a huge mimic display of the system.
Power flow is monitored everywhere that big lines come together.
Raise-lower commands are sent from dispatch to individual power plants to adjust their steam throttle valves.
The goal is to keep the system balanced, generation not too far from load, and all lines operating comfortably within limits.

So what we see in the control room is the frequency recorder typically within 0.01 Hz of 60, and we hear the raise-lower "beeps" coming from the central dispatch office computers to our computers. Mr Turbine obediently adjusts power into the generator. Mr Boiler must follow.
Dispatchers call by radio to request manual voltage adjustments for system reactive balance, or at least they still did when I retired ten years ago. If a 1000-megawatt plant suddenly drops offline, the system indeed slows down, as would any overloaded machine. 1/10 Hz, from 60 to 59.9 Hz, is a big slowdown; the individual plants' speed governors over a substantial area would react quickly to do what they could, and power from adjacent regions would flow in per the laws of Ohm and Kirchhoff. The system dispatcher would re-balance generation by raising the output of all plants in the region as quickly as possible.

If frequency drops drastically, relays out in the system shut off feeders to large blocks of customers to match load to generation.

To your question - motors of course draw more or less power according to their speed. That helps a little bit.
The whole grid rotates together: excess generation accelerates it, deficient generation slows it down, in accordance with freshman physics.

But the real work is done by dispatchers maintaining that system balance.
Be aware that the electric grid is a huge delicately balanced rotating machine spread across the entire country.

And an interesting place to work. EE Students - don't knock power.

Search on "Power System Stability"


(1) J*(dOmega/dt) = Mmech - Mel

J - moment of inertia; Omega - angular speed; t - time; Mmech - mechanical (driving) torque from the turbine; Mel - electromagnetic (braking) torque, set by the electrical load

(2) frequency = (rotating speed in rpm * no. of pole pairs) / 60

All well known, right? How it works then:
1. There is a load drop in the system (Pload goes down), so Mel goes down
2. The mechanical torque from the turbine is constant at first (the governor has not yet reacted)
3. According to (1) we now have a torque surplus, which increases the speed (Omega)
4. Increased speed means higher frequency, by (2)
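Equation (1) above (the swing equation) can be turned into a tiny numerical sketch of the 100 MW grid losing 1 MW of load, as discussed earlier in the thread. The inertia value and time step below are made-up illustrative numbers, not data for any real machine:

```python
# Integrating the swing equation J*(dOmega/dt) = Mmech - Mel for a
# 2-pole machine that keeps producing 100 MW after 1 MW of load is
# lost. J and dt are assumed, illustrative values.
import math

J = 8000.0                  # moment of inertia of machine train, kg*m^2 (assumed)
f0 = 60.0
omega = 2 * math.pi * f0    # rad/s (2-pole machine: electrical = mechanical speed)

P_mech = 100e6              # turbine power, W (governor has not reacted yet)
P_elec = 99e6               # electrical load after losing 1 MW

dt = 0.01
for _ in range(100):        # simulate 1 s of the transient
    M_mech = P_mech / omega           # torque = power / speed
    M_elec = P_elec / omega
    omega += (M_mech - M_elec) / J * dt   # swing equation step

f = omega / (2 * math.pi)
print(f"frequency after 1 s: {f:.3f} Hz")   # rises above 60 Hz
```

With these assumed numbers the surplus megawatt accelerates the rotor by a few hundredths of a hertz per second, which is the "excess absorbed as rotating energy" described above.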

illini1022 said:
So to rehash this one more time.

Pin=Pout is really what I want to understand. At steady state let's say Pin = 100MW and Pout = 100MW. Now 1MW of load is lost. Examining this transient period of time - Pin is still 100MW because your generator has not had time to react.

Pout is still 100MW because of the conservation of energy we just spoke of (Pin=Pout). So my question really was, where does this extra 1MW of power get dissipated during this transient period? Let's assume there is no energy storage or flywheels.

There are only a couple of places I can see where this extra energy could be dissipated: either A) in the mechanical generating turbines themselves, B) the overall connected load helps compensate, or C) some combination of A and B.

I was thinking - when the frequency increases, the inductive reactance of a connected load increases as well, right? So does the connected grid dissipate slightly more power at 61 Hz as opposed to 60 Hz? Or does all of the extra power get burned up as mechanical energy?

Power in = Power out.

If you lose a chunk of your load then the volts will go up and everyone else's equipment will take more current (whether they want it or not) until the fuel regulation cuts in. In the same way, the volts sag when a large extra load is applied.

This is a massive problem for nuclear stations, which take hours to respond to a cut in demand. So much so that I've heard of a proposed pumped-storage scheme that could absorb excess energy by pumping water into a nearby lake. This was to avoid problems for a proposed station at the end of just one vulnerable grid line.

sophiecentaur said:
Power in = Power out.

If you lose a chunk of your load then the volts will go up and everyone else's equipment will take more current (whether they want it or not) until the fuel regulation cuts in. In the same way, the volts sag when a large extra load is applied.

What happens when you rapidly switch off all load? (Generator emergency switch-off at the breaker.) The turbine is still driving the generator, but the power has nowhere to go.

What happens then? The voltage goes REALLY up (generator transformers are built to withstand such a voltage surge). But the rotating speed of the generator goes up as well, until the turbine is shut off (there are special mechanical overspeed devices designed to do this, based on the centrifugal force of a rotating mass).

What I'm trying to say: this is a complex problem. In transient states like load changes there are frequency AND voltage changes. To observe this we would need a lab, or at least a simulation model. Otherwise it's better to assume something (in my previous post I assumed constant voltage, sophie assumed constant frequency; both are simplifications).

sophiecentaur said:
Power in = Power out.

If you lose a chunk of your load then the volts will go up and everyone else's equipment will take more current (whether they want it or not) until the fuel regulation cuts in. In the same way, the volts sag when a large extra load is applied.

This is massive problem for Nuclear Stations, which take hours to respond to a cut in demand. So much so that I've heard of a proposed Pump Storage scheme which could dump excess energy as hydro electric in a nearby lake. This was to avoid problems for a proposed station at the end of just one vulnerable grid line.

I was under the impression that the voltage fluctuations were more a result of the fundamental voltage drop from current flowing through line resistance (as opposed to the imbalance of demand and generation).

E.g., if your generating voltage (stepped up) is 69 kV, then under a heavy load (or overload) you will have more current flowing through the line resistance, and thus the receiving end will see a lower voltage (59.8 kV). When you lose load, you decrease the current, and the receiving-end voltage goes back up (toward 69 kV).
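That voltage-drop picture is just Ohm's law. Below is a minimal sketch; the line resistance and currents are made-up values chosen to roughly reproduce the 69 kV and 59.8 kV figures (real lines are largely reactive, so this is only the arithmetic, not a real line model):

```python
# Receiving-end voltage as sending-end voltage minus the I*R drop
# along the line. R_line and the currents are assumed, illustrative
# values picked to land near the 69 kV -> 59.8 kV numbers above.
V_send = 69_000.0   # sending-end voltage, volts
R_line = 9.2        # effective line resistance, ohms (assumed)

I_heavy = 1000.0    # amps under heavy load (assumed)
I_light = 100.0     # amps after shedding most of the load

V_recv_heavy = V_send - I_heavy * R_line
V_recv_light = V_send - I_light * R_line
print(f"heavy load: {V_recv_heavy/1000:.1f} kV")   # 59.8 kV
print(f"light load: {V_recv_light/1000:.1f} kV")   # back near 69 kV
```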

I think Gerbi is right in that this is just a really complex problem. There are a lot of factors in play here.

I seem to recall something from the information Bob S gave.

In my power transmission courses in school, we dealt with transients occurring in the first few seconds after a fault on the line, which contain components at up to many times the fundamental frequency for the first little while... I want to say it's due to "armature reaction" in the generator when the fault occurs... I'm sorry, I don't rightly recall... I will try to dig out my old notes...

But you should try to research that a little if you can, I think the answer may be there

I can help with the VERY basics.

There are some simple control systems at play and they interact.

Voltage is controlled by an electronic voltage regulator on each generator.
Turbine speed is controlled by a fast, high-gain governor that will drive the steam valves (hence the power being generated) through full travel over about a 3% speed change. Some are 5%.

To prevent a system collapse on loss of a generating unit, load-shedding relays in the system disconnect customers when the frequency decays to certain setpoints. We had three stages; I don't remember exactly, but 59.6, 59.5 and 59.4 Hz sound familiar.

After a short time below ~58 Hz, turbines start disconnecting themselves to protect plant equipment.

Around 56 Hz some nukes trip the reactor, because the coolant pumps need to run faster than that to assure cooling flow.
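The staged scheme described above can be sketched as a simple threshold table. The trip points (59.6/59.5/59.4 Hz load-shed stages, turbine trip below ~58 Hz) are the figures recalled in the post; the code itself is only an illustrative sketch, not real relay logic:

```python
# Staged under-frequency load shedding: each stage trips when the
# frequency falls to or below its setpoint. Setpoints from the post;
# the logic is an assumed, simplified sketch.
SHED_STAGES = [(59.6, "shed block 1"),
               (59.5, "shed block 2"),
               (59.4, "shed block 3")]
TURBINE_TRIP = 58.0

def ufls_actions(freq_hz):
    """Return the protective actions triggered at a given frequency."""
    actions = [msg for setpoint, msg in SHED_STAGES if freq_hz <= setpoint]
    if freq_hz <= TURBINE_TRIP:
        actions.append("turbine trip")
    return actions

print(ufls_actions(59.55))  # ['shed block 1']
print(ufls_actions(57.9))   # all three blocks plus 'turbine trip'
```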

There's a lot going on.
Imagine now, in an overload you need more generation, but if frequency falls low enough you trip your big generators, possibly cascading into a blackout. Too much power flowing into a region that's deficient in generation will trip transmission lines...

In case of over-generation, the turbines would run down to zero power at about 61.8 Hz for a 3% governor, 63 Hz for a 5%. Around 66 Hz the turbine will trip itself offline to protect against centrifugal force.

Your basic answer is what gerbi said, power imbalance accelerates the machine (whole grid) in accordance with its inertia.

It's really that simple. But the interactions get interesting - so many permutations.

Jim, my boy, respect. Ace info.


## 1. Why does the frequency on a power grid increase when generation exceeds load?

The frequency on a power grid is directly tied to the rotational speed of its synchronous generators. When generation exceeds load, the surplus power is absorbed as kinetic energy in the rotating machines, accelerating them; since grid frequency follows generator speed, the frequency rises.

## 2. How does the frequency affect the functioning of electrical devices?

The frequency of a power grid is an important factor in the functioning of electrical devices. Most devices are designed to operate at a specific frequency, and if the frequency deviates significantly from this, it can cause malfunctions or damage to the device. That is why maintaining a stable frequency is crucial for the proper functioning of electrical devices.

## 3. Why is it important to maintain a stable frequency on a power grid?

Maintaining a stable frequency on a power grid is essential for the reliable and efficient operation of the grid. Fluctuations in frequency can lead to power outages, damage to equipment, and even safety hazards. To prevent these issues, grid operators closely monitor and control the frequency, ensuring it stays within a specific range.

## 4. What measures are taken to regulate the frequency on a power grid?

Grid operators use various measures to regulate the frequency on a power grid. One common method is to adjust the output of power plants to match the demand of consumers. In addition, generator governors operating in frequency-response (droop) mode automatically adjust generator output to maintain a stable frequency.

## 5. Can frequency fluctuations on a power grid be harmful to the environment?

Frequency fluctuations on a power grid can have some negative impacts on the environment. For example, if the frequency deviates too much, it can cause damage to electrical equipment, which may lead to the release of hazardous substances. However, by closely monitoring and regulating the frequency, these risks can be minimized, making the grid more environmentally friendly.
