Increasing n in an Optical Fiber: Effects on the Speed of Light

AI Thread Summary
Increasing the index of refraction (n) in an optical fiber results in a larger optical path length, defined as the actual length multiplied by n, a definition chosen so that equal optical path lengths contain the same number of wavelengths. As n increases, the speed of light in the medium decreases in proportion to 1/n, so light propagates more slowly. This can be explained by the interaction of light with electrons in the medium: some of the light is absorbed and re-emitted, altering the phase velocity of the resultant wave. The classical electromagnetic wave perspective provides a foundational understanding, while a quantum-mechanical view involves more complex photon interactions. Understanding these principles is essential for analyzing light behavior in different media.
dervast
Hi, do you know why the optical distance becomes bigger if we increase n in an optical fiber?

Not only this, but why does the light go slower and slower when we increase n? How can we prove that? I know that n = c/u, but I still need more explanation.
 
Can you please help me with that?
 
Showing u = c/n usually takes several pages in an EM physics text.
You will just have to read some physics.
 
dervast said:
Hi, do you know why the optical distance becomes bigger if we increase n in an optical fiber?

(I assume by "optical distance" you mean what English-language physics textbooks call the "optical path length".) It's because we define optical path length to equal the actual length times the index of refraction (n). We define it this way so that equal optical path lengths always contain equal numbers of wavelengths for the same frequency. This makes it easier to analyze interference in situations where light passes through different media with different n's.
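
To make that concrete, here is a small numerical sketch in Python. The values (a 1 m segment, a 1550 nm vacuum wavelength, n = 1.45 as a typical figure for silica fiber) are only illustrative assumptions:

```python
# Illustrative sketch: optical path length and wavelength count in a fiber segment.
c = 3.0e8                 # speed of light in vacuum, m/s
lambda_vac = 1550e-9      # vacuum wavelength, m (common telecom value, assumed)
L = 1.0                   # physical length of the segment, m

for n in (1.0, 1.45):     # vacuum vs. a typical silica value (assumed)
    opl = n * L                   # optical path length = n * physical length
    v = c / n                     # speed in the medium drops by a factor n
    lam = lambda_vac / n          # wavelength in the medium shrinks by a factor n
    cycles = L / lam              # number of wavelengths that fit in the segment
    print(f"n = {n}: OPL = {opl:.2f} m, v = {v:.3e} m/s, "
          f"wavelengths in segment = {cycles:.4e}")
    # Note: cycles == opl / lambda_vac, so equal optical path lengths
    # contain equal numbers of wavelengths at a given frequency.
```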

The wavelength of light in a medium varies inversely with n (lambda = lambda_vacuum / n). The frequency stays the same while the speed also varies inversely with n, which leads to your next question...

Not only this, but why does the light go slower and slower when we increase n? How can we prove that?

You have to analyze carefully the way the light interacts with electrons in the medium. Very broadly speaking, some of the incoming light is absorbed by electrons in the medium. The electrons radiate new light. The remaining incoming light and the new radiated light interfere in such a way that the resultant wave in the medium has a different phase velocity from the incoming light.

The above is the classical picture, in terms of classical electromagnetic waves. If you want a quantum-mechanical picture using photons, I'll let someone else try. I don't know enough quantum electrodynamics.
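
If you want to see how that classical picture produces an actual value of n, the usual starting point is the Lorentz model: treat each bound electron as a driven, damped oscillator and compute the resulting susceptibility. The sketch below is only an illustration; the electron density, resonance frequency, and damping rate are made-up order-of-magnitude assumptions, not real material data:

```python
import numpy as np

# Single-resonance Lorentz (driven damped oscillator) model for n(omega).
# All material parameters are assumed, order-of-magnitude values for illustration.
eps0 = 8.854e-12     # vacuum permittivity, F/m
q_e = 1.602e-19      # electron charge, C
m_e = 9.109e-31      # electron mass, kg
N_e = 2.0e28         # bound-electron number density, 1/m^3 (assumed)
omega0 = 2.0e16      # resonance frequency, rad/s (assumed, in the UV)
gamma = 1.0e14       # damping rate, rad/s (assumed)

def refractive_index(omega):
    """Complex refractive index from the single-resonance Lorentz model."""
    chi = (N_e * q_e**2 / (eps0 * m_e)) / (omega0**2 - omega**2 - 1j * gamma * omega)
    return np.sqrt(1.0 + chi)

# Visible frequencies sit below the assumed UV resonance, so n comes out real
# (to a good approximation), greater than 1, and rising slightly toward the blue.
for wavelength_nm in (700, 550, 400):
    omega = 2 * np.pi * 3.0e8 / (wavelength_nm * 1e-9)
    n = refractive_index(omega).real
    print(f"{wavelength_nm} nm: n = {n:.3f}, phase velocity = {3.0e8 / n:.3e} m/s")
```

The stronger the oscillator response (more electrons per volume, or a driving frequency closer to resonance), the larger the susceptibility and hence n, and the slower the phase velocity c/n.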
 