# Temperature dependence of LED emission bandwidth

• Funtime

## Homework Statement

If an LED is used in an optical communications system, explain what would happen to the temporal spread caused by material dispersion in the fibre as the LED is cooled.

## Homework Equations

The temporal spread due to material dispersion is proportional to the spectral purity $\gamma = \Delta\lambda / \lambda$ and to the dispersion term of the fibre, the wavelength squared multiplied by the second derivative of the fibre's refractive index with respect to wavelength:

$$\Delta\tau = \frac{L}{c}\,\gamma\,\left|\lambda^2 \frac{d^2 n}{d\lambda^2}\right|$$
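As a quick sanity check on the scales involved, here is a minimal sketch evaluating that expression. All the numerical values below (an 850 nm LED with a 30 nm linewidth, $\lambda^2|d^2n/d\lambda^2| \approx 0.022$ for silica near 850 nm, 1 km of fibre) are illustrative assumptions, not values given in the problem:

```python
# Temporal spread from material dispersion:
#   dt = (L / c) * gamma * |lambda^2 * d2n/dlambda2|
# All parameter values below are illustrative assumptions, not from the problem.

C = 299_792_458.0  # speed of light, m/s

def temporal_spread(L_m, wavelength_m, delta_lambda_m, lam2_d2n):
    """Pulse spread (s) over fibre length L_m, given the spectral purity
    gamma = delta_lambda / lambda and the dimensionless dispersion term
    lambda^2 * |d^2 n / d lambda^2|."""
    gamma = delta_lambda_m / wavelength_m
    return (L_m / C) * gamma * lam2_d2n

# Typical LED at 850 nm with ~30 nm linewidth over 1 km of silica fibre,
# taking lambda^2 |d2n/dlambda2| ~ 0.022 (silica near 850 nm).
dt = temporal_spread(L_m=1_000, wavelength_m=850e-9,
                     delta_lambda_m=30e-9, lam2_d2n=0.022)
print(f"{dt * 1e9:.2f} ns per km")  # roughly 2.6 ns
```

A spread of a few nanoseconds per kilometre is why narrowing the LED's spectrum (smaller $\gamma$) matters so much for bandwidth.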

## The Attempt at a Solution

The spectral purity will be the key variable, as the dispersion term is a function of the fibre alone. I know that as the LED cools, the Fermi-Dirac state-occupancy probability will more closely approximate a step function, but I'm having difficulty visualising how that will affect the spread of wavelengths emitted. Am I at least along the right lines so far?

The light emitted by the LED is due to electron-hole recombination, right? The energy released in this recombination sets the frequency of the emitted light. So you're on the right track, I think, and you need to think about how the possible recombination energies depend on temperature.
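One way to make that reply quantitative: the spread of recombination energies is set by the thermal occupation of the band edges, and a common textbook rule of thumb puts the LED emission linewidth at $\Delta E \approx 1.8\,k_B T$. Converting to wavelength via $\Delta\lambda = \lambda^2 \Delta E / (hc)$ gives a linewidth that scales linearly with $T$, so cooling narrows the spectrum, shrinks $\gamma$, and with it the material-dispersion spread. The 1.8 prefactor and the 850 nm / 77 K example numbers here are assumptions for illustration:

```python
# LED spectral linewidth vs temperature, using the rule of thumb dE ~ 1.8 kB T.
# The prefactor 1.8 and the example wavelength/temperatures are assumptions.

KB = 1.380_649e-23     # Boltzmann constant, J/K
H  = 6.626_070_15e-34  # Planck constant, J s
C  = 299_792_458.0     # speed of light, m/s

def linewidth_nm(wavelength_m, T_kelvin, prefactor=1.8):
    """Approximate LED linewidth (nm): dlambda = lambda^2 * dE / (h c),
    with dE = prefactor * kB * T."""
    dE = prefactor * KB * T_kelvin
    return wavelength_m**2 * dE / (H * C) * 1e9

room = linewidth_nm(850e-9, 300)  # room temperature, ~27 nm
cold = linewidth_nm(850e-9, 77)   # liquid-nitrogen temperature
print(f"300 K: {room:.1f} nm, 77 K: {cold:.1f} nm")
# Linewidth (hence gamma, hence the temporal spread) scales linearly with T:
print(f"ratio = {cold / room:.3f}")  # 77/300 ~ 0.257
```

Since $\Delta\tau \propto \gamma \propto \Delta\lambda \propto T$, cooling the LED from 300 K to 77 K would cut the material-dispersion spread by roughly the same factor of ~4, under these assumptions.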

I believe you need to redefine what you are looking for, as "temporal" is a broad term; for this subject, resonance may be simpler.
But here is what I believe is an intriguing idea for the temperature dependence of solid-state light transceivers:
( surface area squared * 1 / current * load / frequency )
(sorry for the two-dimensionality), ending in a Celsius range, unless of course you wish to get Fermi-statistical about it.