What is the ideal wavelength for propagation in optic fibres?

AI Thread Summary
The ideal wavelength for propagation in an optical fibre is set mainly by where the fibre material absorbs the least light, i.e. where it is most transparent, and by where dispersion is lowest. In standard silica fibre the loss minimum is near 1550 nm, while the zero of chromatic dispersion is near 1310 nm. Dispersion matters because it degrades the short light pulses used for high data rates: a truly monochromatic wave would be infinitely long, so a short pulse necessarily contains a spread of wavelengths, and if those wavelengths travel at different speeds the pulse smears out. Minimizing dispersion is essential for preserving the quality of the transmitted signal.
Neural
I was asked this question

Why is there an ideal wavelength for propagation in optic fibres?

a. Light absorption losses are minimum
b. The fibre has its greatest transparency at the ideal wavelength
c. Rayleigh scattering is greatest at this wavelength
d. Total internal reflection occurs only at this wavelength
e. The number of modes is at a minimum only at this wavelength

With these possible answers, does anyone know which ones are correct? I can't work it out.

thanx
 
I really don't know why anyone would want one wavelength at all. Wavelength division multiplexing rocks.
 
Maybe I should have worded it differently.

Why is there an ideal wavelength for propagation in optic fibres?

How about: "Out of these possible answers, which ones give the ideal wavelength for propagation in optic fibres?"
 
I would guess "a", since light absorption depends on wavelength, so there would be a wavelength at which absorption losses are at a minimum for the given material.
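For what it's worth, here is a toy sketch of that idea (my own illustration, using only rough, order-of-magnitude attenuation figures for standard silica fibre): tabulate the quoted loss at a few wavelengths and pick the one where it is smallest.

```python
# Illustrative only: rough attenuation figures (dB/km) for standard silica fibre.
# The exact numbers vary by fibre type; the point is that loss depends on wavelength.
attenuation_db_per_km = {
    850: 2.0,    # first window
    1310: 0.35,  # second window
    1383: 0.5,   # OH "water peak" region (higher loss)
    1550: 0.20,  # third window, lowest loss
}

best = min(attenuation_db_per_km, key=attenuation_db_per_km.get)
print(f"Lowest-loss wavelength: {best} nm "
      f"({attenuation_db_per_km[best]} dB/km)")
```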
 
It's OK, I found them: it's a and b. Thanks anyway. :wink:
 
Actually, the biggest reason is minimal dispersion. Standard silica fibres have their lowest loss near 1550 nm and their zero of chromatic dispersion near 1310 nm, so those two wavelengths are the usual operating windows. Keeping dispersion low matters because it means very short pulses lose their integrity more slowly.

Njorl
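To put a rough number on "pulses lose their integrity": a common back-of-the-envelope estimate for chromatic dispersion broadening is Δt ≈ D · L · Δλ. The sketch below uses assumed, illustrative values (a dispersion parameter of 17 ps/(nm·km), a 100 km link, a 0.1 nm source width), not figures from this thread.

```python
# Back-of-the-envelope chromatic dispersion broadening: delta_t ≈ D * L * delta_lambda.
# All numbers here are illustrative assumptions, not measured values.
D = 17.0            # dispersion parameter in ps/(nm*km), typical of standard fibre near 1550 nm
L = 100.0           # link length in km
delta_lambda = 0.1  # source spectral width in nm

delta_t_ps = D * L * delta_lambda   # pulse spread in picoseconds
print(f"Pulse spread over {L:.0f} km: about {delta_t_ps:.0f} ps")
# ~170 ps of spread would smear adjacent bits in a 10 Gb/s stream (bit period 100 ps),
# which is why operating near, or compensating to, the dispersion minimum matters.
```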
 
Originally posted by Njorl
Actually, the biggest reason is minimal dispersion. Standard silica fibres have their lowest loss near 1550 nm and their zero of chromatic dispersion near 1310 nm, so those two wavelengths are the usual operating windows. Keeping dispersion low matters because it means very short pulses lose their integrity more slowly.

Njorl

I'm not getting something here. Could you tell me what is dispersing if the light is monochromatic? Or do you mean that for wavelengths near the operating wavelength (say within 10 nm of it) there is little dispersion?
 
The signals are made of short pulses. The shorter you can make a pulse, the more info you can send. But when you start getting to very short pulses, the true nature of the lightwave starts to cause problems.

A monochromatic light wave must be infinitely long. This may surprise you, but if the wave is finite it is actually the superposition of many waves. If the pulse is many wavelengths long, almost all of its energy sits in a narrow band of frequencies, so it looks like a simple monochromatic source. If it is a short pulse, a significant amount of energy will be at noticeably different wavelengths (though still fairly close to the carrier). If you want to keep the integrity of the pulse train, all of these different wavelengths must travel at the same speed; in other words, dispersion should be minimized.

Njorl
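A quick numerical sketch of that point (my own illustration, not part of Njorl's post): take the same carrier, cut it into a long pulse and a short pulse, and compare the spread of their spectra with an FFT. The shorter pulse shows a much wider spread of frequencies.

```python
import numpy as np

# Illustrative sketch: a finite pulse of a "monochromatic" carrier contains a spread
# of frequencies, and the shorter the pulse, the wider the spread.
fs = 1000.0                          # samples per unit time (arbitrary units)
t = np.arange(0, 10, 1 / fs)         # time axis
carrier = np.sin(2 * np.pi * 50 * t)  # 50 "Hz" carrier

def spectral_width(pulse_duration):
    """Rough spectral width (std dev of the power spectrum) of a rectangular
    pulse of the given duration cut from the carrier."""
    envelope = (t < pulse_duration).astype(float)
    spectrum = np.abs(np.fft.rfft(carrier * envelope)) ** 2
    freqs = np.fft.rfftfreq(t.size, 1 / fs)
    mean_f = np.sum(freqs * spectrum) / np.sum(spectrum)
    return np.sqrt(np.sum((freqs - mean_f) ** 2 * spectrum) / np.sum(spectrum))

print("long pulse (duration 5.0):  ", spectral_width(5.0))   # narrow spread about the carrier
print("short pulse (duration 0.05):", spectral_width(0.05))  # much wider spread
```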
 
This is intriguing. I'm going to look into this more closely. Thanks.
 