# A few questions about frequency of light

#### sokol8

Does the frequency of a radio wave change as it travels, or is it only the amplitude that can decrease, causing a decline in signal strength? The latter would make more sense to me, because after all you can listen to FM if the same frequency is received... Or maybe f decreases and the receiver has some tolerance for it?

How does the Doppler effect work for light? Let's take a police radar, for example. When they send a signal and then receive it back, there is a time delay, because light of the same frequency (and therefore wavelength) has to travel farther from a speeding car. Why do they say that the frequency and wavelength change, rather than that the travel time is different? So the police radar detects a time delay, right?

It is really confusing to me why some people say that the wavelength changes while the frequency does not. How can that be, since they are linked in one and the same equation, c = wavelength × frequency?

In my understanding, the frequency (and wavelength at the same time) can change only when we consider the Doppler effect measured for stars, where gravitation bends the light and therefore f and the wavelength change.

Basically I think of the frequency (and wavelength) as the absolute carrier of the information, and a change in them changes the information, so they must be constant, while the travel time can change and the amplitude (signal strength) can change. Alternatively, quantum effects come into play that, on astronomical scales (the redshift mentioned for stars), are capable of altering the wavelength and frequency...

OK, so you know my point of view. Can you specialists please confirm what is right and what is not?

Thanks


#### ehild

Homework Helper
Does the frequency of a radio wave change as it travels, or is it only the amplitude that can decrease, causing a decline in signal strength? The latter would make more sense to me, because after all you can listen to FM if the same frequency is received... Or maybe f decreases and the receiver has some tolerance for it?
A source transmits waves with the frequency of the signal it produces. The intensity of the wave decreases with distance as it spreads out.

How does the Doppler effect work for light? Let's take a police radar, for example. When they send a signal and then receive it back, there is a time delay, because light of the same frequency (and therefore wavelength) has to travel farther from a speeding car. Why do they say that the frequency and wavelength change, rather than that the travel time is different? So the police radar detects a time delay, right?
The police radar observes both a time delay and a frequency shift. The chased car would detect a slightly different frequency than that transmitted by the police car, because of time dilation, as its frame of reference moves with respect to the police car. The car reflects the radar signal with the observed frequency, but the police car observes it shifted once again. This change of frequency is used to find the speed of the car: the frequency shift depends on the ratio of the relative velocity between the cars to the speed of light. See http://physics.about.com/od/lightoptics/a/doplight.htm
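As a rough numerical sketch of this, for v ≪ c the two-way (out-and-back) shift reduces to Δf ≈ 2vf/c. The 24 GHz radar frequency and the car speed below are illustrative assumptions, not values from the thread:

```python
# Rough sketch of the two-way Doppler shift a police radar measures.
# For v << c the relativistic formula reduces to df ~ 2 * v * f / c
# (one shift on the way out, one on the reflection).

C = 299_792_458.0  # speed of light, m/s

def radar_doppler_shift(f_emitted_hz: float, v_mps: float) -> float:
    """Approximate shift of a signal reflected off a target moving at
    v_mps (positive = approaching), valid for v << c."""
    return 2.0 * v_mps * f_emitted_hz / C

f_radar = 24e9   # assumed K-band radar, 24 GHz (illustrative)
v_car = 30.0     # car approaching at 30 m/s, i.e. 108 km/h (illustrative)
shift = radar_doppler_shift(f_radar, v_car)
print(f"Doppler shift: {shift:.0f} Hz")  # a few kHz on a 24 GHz carrier
```

A shift of a few kHz on a 24 GHz carrier is easy to measure by mixing the sent and received waves, which is why radar uses the frequency shift rather than the (much harder to resolve) time delay.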

It is really confusing to me why some people say that the wavelength changes while the frequency does not. How can that be, since they are linked in one and the same equation, c = wavelength × frequency?
Both change, the speed of light stays constant.
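This can be checked numerically with the longitudinal relativistic Doppler formula; the 100 MHz source and the 0.1c recession speed below are just illustrative assumptions:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def observed_frequency(f_source: float, beta: float) -> float:
    """Longitudinal relativistic Doppler shift. beta = v/c, taken
    positive when the source recedes from the observer (redshift)."""
    return f_source * math.sqrt((1.0 - beta) / (1.0 + beta))

f_src = 100e6   # assumed 100 MHz source (illustrative)
beta = 0.1      # assumed recession at 0.1 c (illustrative)

f_obs = observed_frequency(f_src, beta)
lam_obs = C / f_obs  # observed wavelength

# The frequency dropped and the wavelength grew, yet lambda * f is still c:
print(f_obs < f_src, lam_obs > C / f_src, abs(lam_obs * f_obs - C) < 1e-3)
```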

In my understanding, the frequency (and wavelength at the same time) can change only when we consider the Doppler effect measured for stars, where gravitation bends the light and therefore f and the wavelength change.
Bending of light and the change of frequency by gravitation (gravitational redshift, http://en.wikipedia.org/wiki/Gravitational_redshift) are different effects, discussed in General Relativity. The Doppler effect changes the observed frequency of an electromagnetic wave when it is emitted by a source moving with respect to the observer; it is treated by Special Relativity.

Basically I think of the frequency (and wavelength) as the absolute carrier of the information, and a change in them changes the information, so they must be constant, while the travel time can change and the amplitude (signal strength) can change. Alternatively, quantum effects come into play that, on astronomical scales (the redshift mentioned for stars), are capable of altering the wavelength and frequency...

OK, so you know my point of view. Can you specialists please confirm what is right and what is not?

Thanks
The frequency of the emitted line is determined by the source, but an observer detects a different frequency because of relativistic effects, just as you measure a different length for a metre stick and a different period for a clock when they move with respect to you.
The redshift and blueshift of light coming from remote stars or galaxies indicate whether the object moves towards the Earth or away from it.

ehild

#### sokol8

Thank you again for very useful answers.
I have got some more questions to ask though, if I may...
In radio, if the frequency of the transmitter and the receiver is the same, how can they communicate if the frequency changes on the way? Surely they must be tuned slightly differently to account for this, mustn't they?

If I understood you correctly, the frequency between the emitter and the receiver will always change, even if both stand still (because e.g. the Earth has a slightly different rotational speed at their positions). And the higher the relative speed between them, the higher the frequency shift. Is that right?

Thanks again

#### ehild

Homework Helper
I am not an expert on the field, so what I write might be not quite correct.
The frequency shift depends on the relative velocity v divided by c. Even if a car travels at 100 km/h, v/c is on the order of 10⁻⁷! That would mean a frequency shift of only a few Hz on a 100 MHz wave. The bandwidth of ordinary FM transmitters and receivers is much broader than that (about 70 kHz).
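Plugging in the numbers above (100 km/h and a 100 MHz FM carrier, both taken from the discussion) confirms the estimate:

```python
C = 299_792_458.0   # speed of light, m/s

v = 100 / 3.6       # 100 km/h converted to m/s (~27.8 m/s)
f = 100e6           # 100 MHz FM carrier

beta = v / C        # v/c ratio
shift = f * beta    # one-way Doppler shift, v << c approximation

print(f"v/c   = {beta:.1e}")      # on the order of 1e-7
print(f"shift = {shift:.1f} Hz")  # a few Hz, tiny next to the ~70 kHz bandwidth
```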

At the same time, police cars can measure the frequency difference between the emitted and received waves very accurately with appropriate equipment. See: http://hyperphysics.phy-astr.gsu.edu/hbase/sound/radar.html

As for the speeds at different points of the Earth, you can estimate them yourself: the average radius of the Earth is about 6370 km, so a point on the equator moves at about 463 m/s. At 10 degrees latitude, which is more than 1000 km away, a point travels along a circle of radius 6273 km, at 456 m/s. The difference is only 7 m/s! So v/c is very, very small.
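The estimate above can be reproduced directly (a 24-hour rotation period is assumed here; using the sidereal day instead would change nothing at this precision):

```python
import math

R_EARTH = 6_370_000.0   # mean Earth radius, m
DAY = 24 * 3600.0       # rotation period, s (solar day, close enough here)

def rotation_speed(latitude_deg: float) -> float:
    """Eastward speed of a point fixed to the Earth's surface, in m/s."""
    r = R_EARTH * math.cos(math.radians(latitude_deg))  # radius of the latitude circle
    return 2.0 * math.pi * r / DAY

v_equator = rotation_speed(0.0)   # ~463 m/s
v_10deg = rotation_speed(10.0)    # ~456 m/s
print(f"{v_equator:.0f} m/s, {v_10deg:.0f} m/s, "
      f"difference {v_equator - v_10deg:.0f} m/s")
```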

ehild

#### sokol8

Thank you again for explanations!
