Understanding Refractive Phase Differences in Electromagnetic Waves

AI Thread Summary
Small changes in phase due to a refractive medium affect the propagation of electromagnetic waves: the speed and wavelength of light are reduced inside the medium while the frequency, set by the emitter, remains constant. For two detectors equidistant from the emitter, one of which sits behind a refractive slab, the medium introduces a fixed extra delay, so the detected signal is offset by a constant phase shift rather than by a difference that grows with time. The concept of optical path length makes this bookkeeping explicit, accounting for the change in wavelength at constant frequency. Microscopically, the wave drives the electrons of the medium to oscillate and re-radiate with a slight lag, which is what delays the propagation of the wave. Understanding these principles is useful for analyzing the effects of refraction on electromagnetic wave behavior.
ajdecker1022
I'm wondering about small changes of phase due to a refractive medium. For example, suppose there is an emitter of radio or light waves, and two detectors equidistant from the emitter. One of the detectors is behind a medium, while the other is a straight shot through a vacuum.

How would the difference in the detectors evolve over time? Would it just be a constant phase shift, or would the angular frequency of the refracted wave change?
How can I think about this properly?

Thanks in advance.
 
Draw a diagram ... if one wave passes through a refractive medium, its wavelength changes in that medium because v = c/n, but the frequency does not change.

The one through vacuum has n=1, so v=c and there is no refraction.
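For concreteness, here is a minimal Python sketch (not from the thread; the frequency and index values are purely illustrative) of how the speed and wavelength scale with n while the frequency stays fixed:

# Illustrative values only: a green-light frequency and a glass-like index.
c = 3.0e8            # speed of light in vacuum, m/s
f = 5.0e14           # source frequency, Hz (set by the emitter, never changes)
n = 1.5              # refractive index of the medium (assumed value)

lam_vac = c / f      # wavelength in vacuum
v_med = c / n        # phase velocity inside the medium
lam_med = v_med / f  # wavelength inside the medium = lam_vac / n

print(lam_vac, v_med, lam_med)   # same frequency, shorter wavelength in the medium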
 
Hmm, perhaps my question was not explained well. The detector behind the medium is not in the medium. I know that the wavelength and the velocity will change, but I'm not sure whether the detected phase change will be proportional to time or constant. My confusion comes from two perspectives:

1. As an analogy, the speed of a convoy is the speed of its slowest ship. Since light is slowed in the medium, all of the light going through the medium should travel at a slower rate. Let's say that for the medium, n = 2. Then the rate of photons hitting the detector behind it (regardless of where they sit in the spectrum?) should be proportional to c/2. It seems like that detector will detect photons half as fast, which in turn seems to imply that somehow the frequency has changed?

Number of photons (vacuum path) = c * k1 * t
Number of photons (through the medium) = (c / 2) * k1 * t
where k1 is some constant of proportionality relating the speed of light to the number of photons. As time goes on, the difference in photons detected at the two receivers will keep growing.

2. Since the frequency is set by the emitter, it should not change. Thus, there should be a constant difference in the number of photons detected.

Number of photons (vacuum path) = c * k1 * t
Number of photons (through the medium) = c * k1 * (t - delta)
where delta is a constant time shift.
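To make the two pictures concrete, here is a small Python sketch with made-up numbers (k1, delta and the sample times are purely illustrative, not from the discussion), just to show how differently the two models behave:

# Picture 1: the detection rate itself is halved behind the medium (n = 2).
# Picture 2: the rate is the same, but shifted by a fixed delay delta.
c = 3.0e8
k1 = 1e-6            # assumed proportionality constant (photons per metre)
n = 2.0
delta = 1e-9         # assumed constant delay, seconds

for t in (1e-6, 1e-3, 1.0):
    vac = c * k1 * t
    med_1 = (c / n) * k1 * t         # picture 1
    med_2 = c * k1 * (t - delta)     # picture 2
    print(t, vac - med_1, vac - med_2)
# In picture 1 the deficit grows without bound; in picture 2 it stays fixed at c*k1*delta.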

Any advice is appreciated.

EDIT: Additionally, I'm not sure how to think about it because light can be thought of as a particle or a wave. Does the particle model apply to radio waves as well?
 
The frequency never changes while passing through a medium - it is always the wavelength.

Radio and light are both electromagnetic waves, so the theory is the same, but there are many practical differences.
One is that refraction rarely matters for radio waves, since they propagate almost entirely through air ... hence n ≈ 1.

But with light you can use the "optical path length" technique: http://en.wikipedia.org/wiki/Optical_path_length

This should provide your answer.
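As a rough illustration of that technique, here is a short Python sketch for a slab of thickness d and index n in front of one detector (d, n and f are assumed values, not from the thread). The extra optical path length gives a fixed phase offset and a fixed time delay, so the detected frequency, and hence the steady-state photon rate, is unchanged:

import math

c = 3.0e8
f = 5.0e14                    # source frequency, Hz (assumed)
lam = c / f                   # vacuum wavelength
d = 1.0e-3                    # slab thickness, m (assumed)
n = 1.5                       # slab index (assumed)

opl_extra = (n - 1) * d                        # extra optical path length vs. vacuum
phase_shift = 2 * math.pi * opl_extra / lam    # extra phase, radians
time_delay = opl_extra / c                     # extra arrival time, seconds

print(opl_extra, phase_shift, time_delay)
# The offset is set by the slab alone; it does not grow with time.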
 
UltrafastPED said:
The frequency never changes while passing through a medium - it is always the wavelength.

Radio and light are both electromagnetic waves, so the theory is the same, but there are many practical differences.
One is that refraction rarely matters for radio waves, since they propagate almost entirely through air ... hence n ≈ 1.

But with light you can use the "optical path length" technique: http://en.wikipedia.org/wiki/Optical_path_length

This should provide your answer.

What is the mechanism by which the wavelength changes in the medium?
The geometry was first shown by Huygens: λ₂ = λ·v/c = λ/n.
I understand that the frequency has to be the same because of the boundary conditions.
So the apparent slowing of light transmission in refraction comes down to a shorter wavelength. I'm not looking for a geometric-optics explanation, but rather for the exact physical mechanism at the micro scale, in terms of the interaction of the time-varying electric field of the EM wave with the electrons in the medium.
 
Would a 'classical' explanation satisfy you? You could think in terms of a distribution of charges with mass that 'loads' the wave as it progresses through the medium: the impressed, varying EM fields force the charges to oscillate, and they re-radiate a little later, delaying the perturbations as the wave advances.
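The standard textbook version of this classical picture is the Lorentz (driven, damped) oscillator model. Below is a minimal Python sketch of it (all parameter values are assumed, purely for illustration): the bound electrons respond to the driving field with a phase lag, and the resulting susceptibility gives a refractive index above 1 below resonance.

import numpy as np

e = 1.602e-19        # electron charge, C
m = 9.109e-31        # electron mass, kg
eps0 = 8.854e-12     # vacuum permittivity, F/m

N = 2.5e28           # oscillator number density, 1/m^3 (assumed)
w0 = 2.0e16          # resonance angular frequency, rad/s (assumed, in the UV)
gamma = 1.0e14       # damping rate, 1/s (assumed)

w = np.linspace(1.0e15, 1.5e16, 5)   # optical driving frequencies, below resonance
chi = (N * e**2 / (eps0 * m)) / (w0**2 - w**2 - 1j * gamma * w)
n_complex = np.sqrt(1 + chi)

for wi, ni in zip(w, n_complex):
    print(f"omega = {wi:.2e}  n = {ni.real:.4f}")
# Below resonance Re(n) > 1: the re-radiated field lags the driving field, so the
# total transmitted wave is effectively slowed (shorter wavelength, same frequency).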
 