# Time Dilation and Gravity's Effect on Light

## Main Question or Discussion Point

I've been reading about how gravity affects light and time. However, I'm a little confused about the details. Does light lose energy and decrease in frequency the further away it gets from Earth? If so, this means that the frequency of light (the number of light waves per second) increases the closer it is to Earth and other celestial bodies. However, why is it that when scientists conducted the water tower experiment in 1962 by putting clocks at the top and bottom of the tower they found out that the clock closer to Earth ran slower? I would think that if the frequency of light increased as it got closer to Earth, then clocks would run faster in comparison to another clock that was further away from Earth. Please help clarify this for me. Thanks!!!

Dale
Mentor
If clock A is transmitting a 1 MHz signal and clock B measures the signal to be 2 MHz then clock B is running slower than clock A. Does that help?

Not really. I have two main questions:

1. Does the frequency of light increase as it gets closer towards Earth or other large celestial bodies?

2. If this is the case, wouldn't a clock closer to Earth go faster than a clock that is further away from the Earth?

Your link is very helpful, thank you. However, why does the watch closer to the Earth appear to run slower when in fact the frequency of light waves increases as they get closer to Earth? I would think that this would cause the watch closer to Earth to appear to run faster than the one farther away from Earth.

George Jones
Staff Emeritus
Gold Member
Your link is very helpful, thank you. However, why does the watch closer to the Earth appear to run slower when in fact the frequency of light waves increases as they get closer to Earth? I would think that this would cause the watch closer to Earth to appear to run faster than the one farther away from Earth.
Let clock A be hovering above hovering clock B. In order for an image of B to be formed at A, light has to travel from B to A. During this upwards travel, the frequency of the light (relative to hovering observers) decreases.

Let clock A be hovering above hovering clock B. In order for an image of B to be formed at A, light has to travel from B to A. During this upwards travel, the frequency of the light (relative to hovering observers) decreases.

I don't think that the frequency of light changes during its upward travel.

Suppose you have a machine gun and you are shooting straight up on "automatic" at the rate of 10 bullets/second. Your bullets will hit the target above you at the same rate of 10 bullets/second no matter how high the target is (providing the bullets have enough initial velocity to hit the target).

The same goes for light rays. The frequency must stay the same during their travel through a gravitational field. Otherwise you'd need to explain how wave crests can appear (or disappear) out of nothing.

Of course, different observers (at different elevations) may disagree about the numerical value of this frequency, because their clocks run at different rates.

DrGreg
Gold Member
The frequency must stay the same during their travel through a gravitational field.
Of course, different observers (at different elevations) may disagree about the numerical value of this frequency...
Those two sentences contradict each other. Either the frequency is the same, or it isn't. The only way to measure a frequency is by an observer. Different observers measure different frequencies, so saying "the frequency must stay the same" makes no sense.

diazona
Homework Helper
Here's the deal, as I understand it: let's say you have two clocks, A at a high elevation and B at a low elevation. In order to compare the rates of the two clocks, you need to pick a single reference point in space and put an "observer" at that position. But the information about the ticks of the clocks needs to be carried to the observer's position somehow - in practice, it's a light signal that does the job. As the light propagates from the clock to the observer, it loses or gains energy due to the change in gravitational potential between the clock and the observer, which causes its frequency to change. (Light propagating away from Earth loses energy and drops in frequency.)

In a typical situation, clock B is at sea level, clock A is on top of Mount Everest, and the observer is way out in space. Suppose each clock ticks at 2 MHz with respect to its own timeline (so if you were standing right next to clock B, you'd measure clock B at 2 MHz, and if you were standing right next to clock A, you'd measure clock A at 2 MHz). The light signals that clock B radiates to indicate its ticks, by the time they get to the observer, will have dropped in frequency to, let's say, 1 MHz (totally not a realistic number). But the light signals that clock A radiates to indicate its ticks would not have lost as much energy, since they're traversing a smaller potential difference - they might reach the observer at 1.2 MHz. So the observer in space would see clock A running faster than clock B.
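This scenario can be put into a few lines. This is only a sketch: it uses the standard Schwarzschild redshift factor sqrt(1 - 2GM/rc^2) for a static emitter seen by a distant static observer, and the 2 MHz tick rate is the illustrative number from the paragraph above (the real shift for clocks near Earth is about a part in 10^9, nothing like 1 MHz vs 1.2 MHz).

```python
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2
M = 5.972e24   # mass of Earth, kg
c = 2.998e8    # speed of light, m/s

def freq_seen_from_infinity(f_emitted_hz, r_m):
    """Frequency a distant static observer measures for light
    emitted at radius r by a static source (Schwarzschild redshift)."""
    return f_emitted_hz * math.sqrt(1 - 2 * G * M / (r_m * c**2))

r_sea = 6.371e6           # sea level (clock B)
r_everest = r_sea + 8848  # summit of Everest (clock A)

f_B = freq_seen_from_infinity(2e6, r_sea)
f_A = freq_seen_from_infinity(2e6, r_everest)
# f_A > f_B: the distant observer sees the higher clock tick faster,
# though for Earth the difference is only parts in 10^10.
```

Both received frequencies are below 2 MHz (both signals climb out of the well), but the Everest signal loses slightly less, which is the whole effect.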

Either the frequency is the same, or it isn't. The only way to measure a frequency is by an observer. Different observers measure different frequencies, so saying "the frequency must stay the same" makes no sense.

Any observer can measure the frequency of the light ray as it travels through different points in space (at different elevations). Each observer would see that the frequency of light does not depend on elevation. However, two different observers may disagree about the numerical value of this frequency, because their (atomic) clocks run at different rates.

(Light propagating away from Earth loses energy and drops in frequency.)
Then you get the same contradiction as noticed by OP: Light oscillations become slower at higher elevations, but clock rates become faster at higher elevations. How could that be?

How about this? A clock on the ground is used to generate a light wave at a frequency related to the local ground clock rate or period, i.e. frequency = 1 / period. The wave travels upwards to orbit and becomes lower in frequency. An observer in orbit measures this lower frequency against his local clock, which is designed to generate the same frequency, and finds that the ground clock's signal has a lower frequency. So he concludes the ground clock is running slow, and the ground crew concludes that the spaceman's clock is running fast.

Suppose you have a machine gun and you are shooting straight up on "automatic" at the rate of 10 bullets/second. Your bullets will hit the target above you at the same rate of 10 bullets/second no matter how high the target is (providing the bullets have enough initial velocity to hit the target).
This is not quite right. If the gravitational time dilation factor is 10 times greater lower down, then when the machine gun is fired for one full second at a rate of 10 bullets per second (as measured by the watch of the gunman at the bottom), the bullets will arrive at a rate of one bullet per second at the top, over a total period of 10 seconds as measured by the watch of the observer at the top.

Sometimes the observed redshift of photon frequency higher up is explained as a stretching of the photon's wavelength (bigger gaps between the crests) as it rises. While the increased wavelength is real, it is not the whole story, because the slowdown in the rate of arrival of the bullets at the top cannot be explained by an increase in wavelength. The fact is, clocks lower down really do run slower relative to clocks higher up in a very physical sense. It is also true that if identical twins were separated at birth, one placed low down in the gravitational field and one higher up, then when they get back together again, one could be a young boy of 10 and the other an old man of 100 with grey hair and all that. See page 118 of this Google book http://books.google.com/books?id=0J_dwCmQThgC&printsec=frontcover#PPA118,M1

The same goes for light rays. The frequency must stay the same during their travel through a gravitational field. Otherwise you'd need to explain how wave crests can appear (or disappear) out of nothing.
Note in the machine gun example (nice example by the way) that 10 bullets left the bottom and 10 bullets arrived at the top, so no missing bullets and by analogy, no additional or missing wave crests.

Then you get the same contradiction as noticed by OP: Light oscillations become slower at higher elevations, but clock rates become faster at higher elevations. How could that be?
Here is an experiment you can do. Get two people without a watch or clock to measure the drip rate of a leaking tap (a frequency) using some rough method such as their heart pulse. The chances are that they will come up with two different drip rates even though they are measuring the same thing. The speed of the clock used to make the measurement affects the measurement: the faster the clock, the slower the apparent frequency of the phenomenon being measured. In relativity, you cannot always trust the measurements made by clocks, because speed and gravity alter the rate clocks run at.

Dale
Mentor
1. Does the frequency of light increase as it gets closer towards Earth or other large celestial bodies?
Yes.

2. If this is the case, wouldn't a clock closer to Earth go faster than a clock that is further away from the Earth?
No, this is backwards.

Consider two clocks that are each shipped from the factory as 10 MHz clocks (10 million ticks per second). Clock A ships intact, but clock B is dropped in shipment and now runs slow at 5 MHz (only 5 million ticks per second). Clock A is used as a modulator to broadcast a 10 MHz signal and clock B is used as a demodulator to receive it. So, what frequency does clock B measure? In the time that B ticks 5 million times it receives 10 million cycles of the broadcast from A, so B (thinking that 5 million ticks is half a second) calls the received frequency 20 MHz. So the clock that ticks slow is the one that measures a signal as being higher (blueshifted).
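This arithmetic can be checked in a couple of lines. A sketch using only the numbers in the post: a receiver that converts its ticks to "seconds" using its factory rating rather than its actual rate.

```python
def measured_frequency_hz(signal_hz, nominal_tick_hz, actual_tick_hz):
    # The receiver counts signal cycles per tick, then converts ticks
    # to "seconds" using its factory rating, not its actual slow rate.
    cycles_per_tick = signal_hz / actual_tick_hz
    return cycles_per_tick * nominal_tick_hz

# Clock B: rated 10 MHz, actually ticking at 5 MHz, receiving A's 10 MHz broadcast.
measured_frequency_hz(10e6, 10e6, 5e6)  # -> 20 MHz: the slow clock sees a blueshift
```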

diazona
Homework Helper
Then you get the same contradiction as noticed by OP: Light oscillations become slower at higher elevations, but clock rates become faster at higher elevations. How could that be?
As others have said, there is no contradiction.

At higher elevations, if the light oscillations are slower (than the clock rates), then the clock rates are faster (than the light oscillations).

This is not quite right. If the gravitational time dilation factor is 10 times greater lower down, then when the machine gun is fired for one full second at a rate of 10 bullets per second (as measured by the watch of the gunman at the bottom), the bullets will arrive at a rate of one bullet per second at the top, over a total period of 10 seconds as measured by the watch of the observer at the top.
I think we fully agree here. Let me summarize what different observers see:

Observer at the bottom: 10 bullets are fired per second and they hit the target 10 times per second.

Observer at the top: 1 bullet is fired per second and they hit the target 1 time per second.

Observer very far away (where the gravity field is absent): 1 bullet is fired every 10 seconds and they hit the target at a rate of 0.1/second.

So, we conclude that each observer agrees that the frequency of bullets does not change in the course of their travel through the gravitational field. However, observers disagree about the numerical value of this frequency. This is because each observer uses its own clock to measure time or frequency. The clock rate depends on the gravitational potential. "One second" at the bottom (large negative gravitational potential) lasts 100 times longer than "one second" measured by the clock in space.
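The three observations above all follow from one rule: scale the proper emission rate by the ratio of the emitter's clock rate to the observer's clock rate. A sketch using the illustrative dilation factors from this thread (bottom clock 10 times slower than the top, 100 times slower than the far-away clock):

```python
def observed_rate_per_s(proper_rate, emitter_clock_rate, observer_clock_rate):
    # Clock rates are proper seconds per far-away coordinate second;
    # deeper in the potential means a smaller rate.
    return proper_rate * emitter_clock_rate / observer_clock_rate

BOTTOM, TOP, FAR = 0.01, 0.1, 1.0  # illustrative factors from the thread

observed_rate_per_s(10, BOTTOM, BOTTOM)  # bottom observer: 10 bullets/s
observed_rate_per_s(10, BOTTOM, TOP)     # top observer: 1 bullet/s
observed_rate_per_s(10, BOTTOM, FAR)     # far observer: 0.1 bullets/s
```

Every observer applies the same rule to every point of the bullets' flight, which is why each of them sees a constant rate: only the numerical value differs between observers.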

Another conclusion is that the gravitational red shift or blue shift has nothing to do with the loss (or gain) of kinetic energy by photons propagating in the field. These shifts are fully explained by the fact that the energy levels of atoms move closer to each other in strong negative gravitational potentials (at the bottom). This has two effects: first, photons emitted by atoms at the bottom have lower energy and lower frequency than photons emitted by the same atoms in free space; second, clocks at the bottom have a lower rate, because according to quantum mechanics the rate of any time-dependent process is proportional to the separation between stationary energy levels.

George Jones
Staff Emeritus
Gold Member
I don't think that the frequency of light changes during its upward travel.
I don't know what you mean by this. As I made clear in my previous post, I mean that different observers measure different frequencies.
The frequency must stay the same during their travel through a gravitational field. Otherwise you'd need to explain how wave crests can appear (or disappear) out of nothing.
What is your operational definition of frequency?
Of course, different observers (at different elevations) may disagree about the numerical value of this frequency, because their clocks run at different rates.
Yes, we agree on actual measurements, but we disagree on interpretation. I prefer not to talk about comparison of clock rates unless the clocks are at the same event. See

Also, common statements about comparisons of clocks at different events involve inconsistent interpretations.

"Clocks at different heights in a gravitational field run at different rates" is interpreted as a visual effect as in the above link.

"Moving clocks run slow" refers to the difference in elapsed coordinate times, not a visual effect. In terms of visual effects, moving clocks can run fast or slow; see

Consequently, I prefer not to compare the rates of clocks that are at different events.

I prefer not to talk about comparison of clock rates unless the clocks are at the same event. See

Also, common statements about comparisons of clocks at different events involve inconsistent interpretations.

"Clocks at different heights in a gravitational field run at different rates" is interpreted as a visual effect as in the above link.
Let's say George and Franklin are twins. They both have an identical mass of radioactive material for experimental purposes and accurate synchronised clocks. When they are 10 years old Franklin descends deep into a gravitational well. George watches Franklin's clock from above with a telescope for 80 years and sees that only 5 years appears to have passed on Franklin's clock. Now George descends to meet his brother and when they get together they find Franklin is 15 years old physically and George is 90 years old physically and the physical differences are very obvious. Franklin is a fit teenager and George is an old man with grey hair and all that. They compare the experimental radioactive masses and find that George's has decayed significantly more, consistent with difference in elapsed time that has passed on their respective clocks. Would George still claim gravitational time dilation is purely a "visual effect"? (or would he claim that the twins would not have aged differently when they get together again?)

It might be worth observing that when two twins are moving away from each other in the classic twin paradox of SR, each observes the other's clock to be going slower than his own, and it is meaningless to claim that one or the other really is ageing slower until one of them turns around. In the GR example it is different: the lower twin sees the higher twin's clock run faster, and the higher twin sees the lower twin's clock run slower. There is no ambiguity in the GR case, unlike the SR case.

I don't know what you mean by this. As I made clear in my previous post, I mean that different observers measure different frequencies.

Yes, we agree on actual measurements, but we disagree on interpretation. I prefer not to talk about comparison of clock rates unless the clocks are at the same event.

The original question was whether or not the light frequency changes while the light pulse travels in the gravitational field. As I understand it, you prefer to define the light frequency based on the rate of the clock which is next to the pulse at a given time instant. So, basically, you have a bunch of observers placed along the light pulse's path, and you define "frequency" as "oscillations per second", where the second is measured by the clock which is closest to the pulse. If this is your definition, then I fully agree that the light frequency thus defined will vary. This variation is explained completely by the fact that clocks at different elevations tick at different rates. So, the meaning of the "second" is different at different elevations.

My approach is different. I think that the frequency of the light pulse (or any other observable) should always be defined with respect to one observer. Of course, with time the pulse will move farther and farther away from this observer, but he should still be able to measure the frequency of the pulse from a distance (how to do that technically is another question) and relate this frequency to his own clock. My claim is that from the point of view of this single observer the frequency of the light pulse does not change in time (the frequency does not depend on the pulse's elevation).

Other observers (stationed at different elevations) can also observe the same light pulse during its travel in the gravity field. All these observers would agree that the frequency does not change with time. However, they would disagree about the numerical value of this frequency, simply because all these observers have different definitions of the "second".