If an LED is used in an optical communications system, explain what would happen to the temporal spread caused by material dispersion in the fibre as the LED is cooled.
I know that the temporal spread due to material dispersion is proportional to the spectral purity $\gamma = \Delta\lambda/\lambda$ and to the fibre's dispersion parameter $\lambda^2 \left| \frac{d^2 n}{d\lambda^2} \right|$, i.e.
$$\Delta\tau = \gamma \, \frac{L}{c} \, \lambda^2 \left| \frac{d^2 n}{d\lambda^2} \right|.$$
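That proportionality can be sketched numerically. Every value below is an assumed, illustrative figure rather than data from the problem: a centre wavelength of 850 nm, a 1 km fibre, $\lambda^2|d^2n/d\lambda^2| \approx 0.022$ (roughly typical for silica near 850 nm), and an LED spectral width of about $2kT$ in energy, which gives $\Delta\lambda \approx \lambda^2 \cdot 2kT/(hc)$:

```python
# Illustrative sketch (all parameter values assumed, not from the problem):
# temporal spread Δτ = γ · (L/c) · λ²|d²n/dλ²|, with the LED linewidth
# taken as ΔE ≈ 2kT, so Δλ ≈ λ² · 2kT / (hc).

h  = 6.626e-34   # Planck constant, J s
c  = 2.998e8     # speed of light, m/s
kB = 1.381e-23   # Boltzmann constant, J/K

lam   = 850e-9   # assumed LED centre wavelength, m
L     = 1.0e3    # assumed fibre length, m
D_mat = 0.022    # assumed λ²|d²n/dλ²| for silica near 850 nm (dimensionless)

def temporal_spread(T):
    """Δτ in seconds for an LED at temperature T (K), under ΔE ≈ 2kT."""
    dlam  = lam**2 * 2 * kB * T / (h * c)  # spectral width Δλ, m
    gamma = dlam / lam                     # spectral purity γ = Δλ/λ
    return gamma * (L / c) * D_mat         # temporal spread Δτ, s

for T in (300, 77):
    dlam = lam**2 * 2 * kB * T / (h * c)
    print(f"T = {T:3d} K: Δλ ≈ {dlam*1e9:5.1f} nm, "
          f"Δτ ≈ {temporal_spread(T)*1e9:5.2f} ns")
```

Under these assumptions Δτ scales linearly with T, so cooling from room temperature to liquid-nitrogen temperature shrinks the spread by a factor of roughly 300/77 ≈ 3.9.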
The Attempt at a Solution
The spectral purity will be the key variable, since the dispersion parameter is a property of the fibre alone. I know that as the LED cools, the Fermi-Dirac state-occupancy probability will more closely approximate a step function, but I'm having difficulty visualising how that will affect the spread of wavelengths emitted. Am I at least along the right lines so far?