# Refractive Phase Difference

#### ajdecker1022

The geometry was first shown by Huygens: λ2 = λv/c. The frequency has to be the same because of the boundary conditions. Huygens' construction says the wavelength changes because the speed of light in a refractive medium is slower than in a vacuum.

#### ajdecker1022

I'm wondering about small changes of phase due to a refractive medium. For example, suppose there is an emitter of radio or light waves, and two detectors equidistant from the emitter. One of the detectors is behind a medium, while the other is a straight shot through a vacuum.

How would the difference in the detectors evolve over time? Would it just be a constant phase shift, or would the angular frequency of the refracted wave change?

Draw a diagram ... if one wave is subject to a refractive medium it will have a change of wavelength in that medium because v=c/n, but the frequency will not change.

The one through vacuum has n=1, so v=c and there is no refraction.
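A quick numeric sketch of the v = c/n relation with the frequency held fixed by the source. The numbers (500 nm light, n = 1.5) are illustrative assumptions, not taken from the thread:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def in_medium(wavelength_vac, n):
    """Speed, wavelength, and frequency inside a medium of index n.

    The frequency is fixed by the emitter's boundary conditions; only the
    phase speed and the wavelength scale down by the factor n.
    """
    f = C / wavelength_vac  # frequency, set by the source
    v = C / n               # phase speed in the medium: v = c/n
    lam = v / f             # shortened wavelength: lambda_vac / n
    return v, lam, f

# 500 nm (green) light entering glass with n = 1.5 (assumed values)
v, lam, f = in_medium(500e-9, 1.5)
```

Running this, `lam` comes out to `500e-9 / 1.5` while `f` is unchanged from its vacuum value, which is the point the reply above is making.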

Hmm, perhaps my question was not explained well. The detector behind the medium is not in the medium. I know that the wavelength and the velocity change inside the medium, but I'm not sure whether the detected phase difference will be proportional to time or constant. My confusion comes from two perspectives:

1. As an analogy, the speed of a convoy is that of its slowest ship. Since the speed of light is slowed through the medium, all of the light going through the medium should travel at a slower rate. Let's say that in the medium, n = 2. Then c / 2 is proportional to the number of photons (regardless of frequency in the spectrum?) hitting the detector. It seems like the detector behind the medium will detect photons half as fast, which in turn seems to imply that somehow the frequency has changed?

Number of photons (vacuum path) = c * k1 * t
Number of photons (medium path) = (c / 2) * k1 * t
where k1 is some proportionality constant relating the speed of light to the number of photons. As time goes on, the difference in photons detected at the two receivers will grow.

2. Since the frequency is constant and determined by the emitter, the frequency should not change. Thus, there should be a constant difference of photons detected.

Number of photons (vacuum path) = c * k1 * t
Number of photons (medium path) = c * k1 * (t - delta)
where delta is the constant shift.

EDIT: Additionally, I'm not sure how to think about it because light can be thought of as a particle or a wave. Does the particle model apply to radio waves as well?

The frequency never changes while passing through a medium - it is always the wavelength that changes.

Radio and light are both electromagnetic waves, so the theory is the same, but there are many practical differences.
One is that the index of refraction rarely matters for radio waves, because they mostly travel through air, for which n is very nearly 1 ... hence n = 1.

But with light you can use the "optical path length" technique: http://en.wikipedia.org/wiki/Optical_path_length
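The optical-path-length bookkeeping also answers the original two-detector question: the slab adds a fixed amount (n - 1)·d of optical path, so the detector behind it sees a constant phase offset, not a drift that grows with time (the poster's perspective 2). A sketch, with made-up values for the frequency, slab thickness, index, and geometric path:

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def phase(t, f, geometric_path, slab_d=0.0, n=1.0):
    """Phase at a detector at time t for a wave of frequency f.

    A slab of index n and thickness slab_d adds (n - 1) * slab_d of
    optical path length; the frequency itself is unchanged.
    """
    opl = geometric_path + (n - 1.0) * slab_d  # optical path length
    return 2.0 * math.pi * f * (t - opl / C)

f, d, n, L = 100e6, 0.5, 1.5, 10.0  # assumed: 100 MHz, 0.5 m slab, n=1.5, 10 m path
offsets = [phase(t, f, L) - phase(t, f, L, d, n) for t in (0.0, 1e-6, 2e-6)]
# Every entry of `offsets` equals 2*pi*f*(n - 1)*d / C:
# a fixed phase lag, the same at all times t.
```

The offset is the same at every sampled time, which is what "constant phase shift" means here: the frequencies at the two detectors are identical, so the difference never accumulates.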
