# Amplitude change of light during refraction

## Main Question or Discussion Point

A monochromatic source of light in a vacuum shines light into a block of glass. How does the amplitude of the light wave change once it is inside the glass, and once it leaves the glass? I know that some energy (and hence amplitude) is lost by the light wave when it enters the glass due to slight reflection. But is that quantity of energy even measurable? Since I am only interested in the theoretical aspect of this problem, would that mean that the energy and amplitude of the wave are assumed to remain constant throughout? If not, how do they decrease once inside the glass?

jtbell (Mentor):
See Fresnel's equations. They can be derived from the electromagnetic wave solutions to Maxwell's equations and the boundary conditions on the electric and magnetic fields at the surface of the medium.

For glass with an index of refraction n = 1.50, the theoretical reflection loss at the entrance surface is about 4% for normal incidence, and the same at the exit surface. Away from normal incidence the reflection losses are generally greater (though for p-polarized light the loss drops to zero at Brewster's angle, and beyond the critical angle the internal reflection at the exit surface becomes total). These reflection losses are easily observable (and measurable): take a laser pointer and shine it through a glass window and observe the reflection. At oblique incidence you can probably see both the entrance and exit reflections as separate spots. The attenuation loss in the glass can theoretically be zero, but only if the glass is dispersionless. If there is dispersion in the glass, then there has to be attenuation; this is required by the Kramers-Kronig relations.
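As a quick check of the 4% figure, the normal-incidence reflectance can be evaluated directly from the standard Fresnel formula R = ((n₂ − n₁)/(n₂ + n₁))² (a minimal sketch; the function name is my own):

```python
def reflectance(n1, n2):
    """Fraction of incident power reflected at normal incidence
    going from a medium of index n1 into a medium of index n2."""
    return ((n2 - n1) / (n2 + n1)) ** 2

# Vacuum (n = 1.0) into glass (n = 1.50)
R = reflectance(1.0, 1.5)
print(round(R, 4))  # 0.04, i.e. the 4% loss quoted above
```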

In this case, the light waves are striking the glass at normal incidence.
So you're saying that 4% of the total energy is lost when entering the glass, the amplitude then remains the same throughout the glass, and finally another 4% of the energy is lost when leaving the glass? Can you please explain why this happens? I don't really see why another 4% would be lost when exiting the glass.
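The exit loss follows from the fact that the normal-incidence Fresnel reflectance is symmetric in the two indices, so glass-to-vacuum reflects the same fraction as vacuum-to-glass. A short numerical sketch (assuming the standard formula, with a helper function of my own naming):

```python
def reflectance(n1, n2):
    """Normal-incidence power reflectance from medium n1 into medium n2."""
    return ((n2 - n1) / (n2 + n1)) ** 2

entry = reflectance(1.0, 1.5)   # vacuum -> glass
exit_ = reflectance(1.5, 1.0)   # glass -> vacuum: identical by symmetry
print(round(entry, 4), round(exit_, 4))      # 0.04 0.04

# Power remaining after both surfaces (ignoring absorption in the glass):
print(round((1 - entry) * (1 - exit_), 4))   # 0.9216, about 92%
```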

A light wave is composed of a transverse E field and a transverse H field; the wave propagates along the cross product E × H. For normal incidence, both E and H are parallel to the surface, and their tangential components are continuous across the boundary. Because glass has an index of refraction n, its relative dielectric constant ε equals n², and the wave impedance (the ratio E/H) inside the glass is 377/n ohms instead of the vacuum value of 377 ohms**. It is impossible to conserve the power of the light wave (proportional to E × H) and simultaneously satisfy both impedance ratios with the incident and transmitted waves alone, so there must be a reflected wave at the surface; its power fraction is (n − 1)²/(n + 1)² = 4% for glass. The same 4% is reflected when the light exits the glass. A comprehensive discussion can be found in Jackson, "Classical Electrodynamics" (2nd ed.), Sect. 7.3, or Slater and Frank, "Electromagnetism", Chapter X, Sec. 3.
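The impedance bookkeeping in that argument can be sketched numerically, assuming the standard transmission-line form of the Fresnel coefficients (all variable names here are my own):

```python
n = 1.5
eta0 = 376.73            # wave impedance of vacuum, ohms (approx.)
eta_glass = eta0 / n     # wave impedance inside the glass, as stated above

# Amplitude coefficients at the entrance surface (transmission-line form):
r = (eta_glass - eta0) / (eta_glass + eta0)   # reflected E / incident E
t = 2 * eta_glass / (eta_glass + eta0)        # transmitted E / incident E

R = r ** 2                        # reflected power fraction
T = (eta0 / eta_glass) * t ** 2   # transmitted power fraction (impedance-corrected)

print(round(r, 3))      # -0.2: reflected amplitude is 20% of incident, inverted
print(round(R, 3))      # 0.04: the 4% power loss
print(round(R + T, 6))  # 1.0: power is conserved
```

Note that the transmitted amplitude ratio (0.8) is not 0.96; the power balance only closes once the differing wave impedances of the two media are taken into account.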

** Ohms is the unit for the ratio volts/amps. Because E is measured in volts per meter and H in ampere-turns per meter, the ratio E/H also has the units of ohms. In a coaxial cable like RG-58, the analogous characteristic impedance (the ratio of voltage to current) is 50 ohms.
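The 377-ohm figure in this footnote can be derived from the vacuum constants, η₀ = √(μ₀/ε₀) (a sketch using the classical value of μ₀ and a CODATA-style value of ε₀):

```python
import math

mu0 = 4e-7 * math.pi         # vacuum permeability, H/m (classical definition)
eps0 = 8.8541878128e-12      # vacuum permittivity, F/m

eta0 = math.sqrt(mu0 / eps0) # wave impedance of free space
print(round(eta0, 2))        # 376.73 ohms, the "377 ohms" above

# Inside glass with n = 1.5 the wave impedance drops to eta0 / n:
print(round(eta0 / 1.5, 2))  # 251.15 ohms
```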