# Refractive Phase Difference

I'm wondering about small changes of phase due to a refractive medium. For example, suppose there is an emitter of radio or light waves, and two detectors equidistant from the emitter. One of the detectors is behind a medium, while the other is a straight shot through a vacuum.

How would the difference in the detectors evolve over time? Would it just be a constant phase shift, or would the angular frequency of the refracted wave change?

UltrafastPED
Gold Member
Draw a diagram ... if one wave is subject to a refractive medium it will have a change of wavelength in that medium because v=c/n, but the frequency will not change.

The one through vacuum has n=1, so v=c and there is no refraction.
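A quick numeric sketch of the v = c/n statement above (all values illustrative): the frequency set by the emitter is unchanged, while the wavelength in the medium shortens to λ/n.

```python
# Sketch with illustrative values: wavelength shortens in a medium, frequency does not.
c = 3.0e8   # speed of light in vacuum, m/s
f = 100e6   # emitter frequency, Hz (100 MHz radio, assumed)
n = 1.5     # hypothetical refractive index of the medium

lam_vacuum = c / f          # wavelength in vacuum
v_medium = c / n            # phase velocity in the medium
lam_medium = v_medium / f   # wavelength in the medium

print(lam_vacuum)   # 3.0 m
print(lam_medium)   # 2.0 m  (= lam_vacuum / n)
# The frequency f is the same on both paths; only v and lambda change.
```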

Hmm, perhaps my question was not explained well. The detector behind the medium is not in the medium. I know that the wavelength and the velocity will change, but I'm not sure whether the detected phase change will be proportional to time or constant. My confusion comes from two perspectives:

1. As an analogy, the speed of a convoy is the speed of its slowest ship. Since light is slowed in the medium, all of the light going through the medium should travel at a slower rate. Let's say that in the medium, n = 2, so the light there travels at c/2. If the number of photons hitting a detector is proportional to the speed of the light reaching it (regardless of frequency in the spectrum?), it seems like the detector behind the medium will detect photons half as fast, which in turn seems to imply that somehow the frequency has changed?

Number of photons (vacuum path) = c * k1 * t
Number of photons (medium path) = (c/2) * k1 * t
where k1 is some proportionality constant relating the speed of light to the number of photons. As time goes on, the difference in photons detected at each receiver will increase.

2. Since the frequency is constant and determined by the emitter, the frequency should not change. Thus, there should be a constant difference in the number of photons detected:

Number of photons (vacuum path) = c * k1 * t
Number of photons (medium path) = c * k1 * (t - delta)
where delta is a constant time shift.

EDIT: Additionally, I'm not sure how to think about it because light can be thought of as a particle or a wave. Does the particle model apply to radio waves as well?
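For what it's worth, perspective 2 describes the steady state: a slab of fixed thickness adds a fixed extra travel time, so once the wave has been on for a while, both detectors see the same frequency with a constant phase offset. A minimal sketch, with the slab thickness d, index n, and frequency f as assumed illustrative values:

```python
# Sketch (assumed values): the slab adds a constant delay, hence a
# constant phase shift, not a change in detected frequency.
import math

c = 3.0e8   # m/s
n = 2.0     # hypothetical index of the slab
d = 0.3     # slab thickness, m (assumed)
f = 100e6   # emitter frequency, Hz (assumed)

delta_t = (n - 1) * d / c              # constant extra travel time through the slab
delta_phi = 2 * math.pi * f * delta_t  # constant phase offset, radians

print(delta_t)    # about 1 ns for these values
print(delta_phi)  # about 0.63 rad; constant in time
```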

UltrafastPED
Gold Member
The frequency never changes while passing through a medium - it is always the wavelength that changes.

Radio and light are both electromagnetic waves, so the theory is the same, but there are many practical differences.
One is that the index of refraction barely matters for radio waves, because propagation is almost always through air ... hence n ≈ 1.

But using light you can use the "optical path length" technique: http://en.wikipedia.org/wiki/Optical_path_length
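The optical-path-length bookkeeping can be sketched as follows, assuming one path contains a slab of thickness d and index n inside an otherwise equal geometric distance L (all values hypothetical):

```python
# Sketch of optical path length (OPL) for the two equidistant paths.
# OPL is the sum of n_i * d_i over the segments of a path.
import math

L = 10.0      # geometric distance emitter -> detector, m (assumed)
d = 0.3       # slab thickness, m (assumed)
n = 1.5       # hypothetical index of the slab
lam = 500e-9  # vacuum wavelength, m (assumed)

opl_vacuum = L                 # vacuum path: OPL equals the geometric length
opl_medium = (L - d) + n * d   # slab path: vacuum segment plus n * d in the slab

delta_opl = opl_medium - opl_vacuum        # = (n - 1) * d, independent of time
delta_phi = 2 * math.pi * delta_opl / lam  # constant phase difference, radians
```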

morrobay
Gold Member
> The frequency never changes while passing through a medium - it is always the wavelength that changes.
>
> Radio and light are both electromagnetic waves, so the theory is the same, but there are many practical differences. One is that the index of refraction barely matters for radio waves, because propagation is almost always through air ... hence n ≈ 1.
>
> But using light you can use the "optical path length" technique: http://en.wikipedia.org/wiki/Optical_path_length

What is the description of the mechanism by which the wavelength changes in the medium?
The geometry was first shown by Huygens: λ2 = λ · v/c = λ/n.
I understand that the frequency has to be the same because of the boundary conditions.
So the apparent slowing of light in refraction is because of a shorter wavelength. I'm not looking for a geometric-optics explanation, but rather for the exact physical mechanism at the micro scale, in terms of the interaction of the time-varying electric field of the EM wave with the electrons in the medium.

sophiecentaur