The fact that light has a smaller apparent speed in a transparent medium can be explained classically by considering the motion of the medium's electrons in the oscillating (radiation) electric field produced by the source. Because they accelerate, these electrons emit an electric field that, at the detection point, is 90 degrees out of phase with (lags behind) the field produced by the source. The total field at the detection point can then be written as the superposition of a cosine wave (from the source) and a sine wave of smaller amplitude (from the medium). As expected, this superposition lags behind the source's cosine wave, which explains the smaller apparent speed of light in the medium.

However, the amplitude of this superposition is GREATER than the amplitude of the cosine wave alone, which does not sound right. Putting a transparent medium between the source and the detection point obviously cannot increase the amplitude, right?
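For concreteness, here is the superposition I have in mind, writing the medium's contribution with a small relative amplitude $\varepsilon$ (the symbols $E_0$, $\omega$, $\varepsilon$, $\delta$ are just my notation, with $0 < \varepsilon \ll 1$):

$$E(t) = E_0\cos(\omega t) + \varepsilon E_0\sin(\omega t) = E_0\sqrt{1+\varepsilon^2}\,\cos(\omega t - \delta), \qquad \tan\delta = \varepsilon.$$

The phase lag $\delta \approx \varepsilon$ is first order in $\varepsilon$, while the amplitude $E_0\sqrt{1+\varepsilon^2} \approx E_0\left(1 + \varepsilon^2/2\right)$ exceeds $E_0$ only at second order. Still, it is an increase, and that is exactly what bothers me.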