How does coherence length of light depend on wavelength bandwidth?

AI Thread Summary
The coherence length of light is influenced by its wavelength bandwidth, with narrower bandwidths leading to longer coherence lengths. This is because broader optical bandwidths introduce more frequency components, causing wave phases to become misaligned more quickly, resulting in a loss of coherence. Coherence is fundamentally a measure of how predictably the phase of light can be determined over time and space, with single-frequency light allowing for infinite predictability. In optical coherence tomography (OCT), using monochromatic light can yield interference over longer distances, but for high-resolution imaging, less monochromatic light is preferred to enhance visibility of interference patterns. The relationship between coherence time and frequency bandwidth is mathematically expressed, indicating that narrower spectral ranges correspond to longer coherence times.
narra
Hi.

What physically gives light its coherence length, and why does it increase for narrow bandwidths? My thought is that the broader the optical bandwidth, the more frequency components are present, and thus the wave crests (and troughs) of each component spread apart more rapidly than for quasi-monochromatic light. Once the frequency components are significantly mis-phased, do we then say that coherence is lost?

Can we strictly say that light is coherent so long as its phase does not alter?

Thanks
 
The "coherence" of a field is a measure of how well you can predict the value(s) of the field at future times (and at other places).

If a single frequency is present, you can predict the value arbitrarily far into the future (infinite coherence time). If instead the field has a range of frequencies present, the ability to predict future values is less than certain (finite coherence time) due to the statistical nature of the field.
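This predictability can be illustrated numerically. Below is a minimal sketch (not from the thread) that models the field as a sum of complex waves with random phases and checks how correlated the field stays with a delayed copy of itself. The 5% frequency spread, the number of components, and all names are illustrative assumptions, not values anyone in the thread used.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 200.0, 4000)        # time axis, arbitrary units

def field(freqs):
    """Analytic-signal model: superpose unit waves with random phases."""
    phases = rng.uniform(0.0, 2 * np.pi, size=len(freqs))
    return sum(np.exp(1j * (2 * np.pi * f * t + p))
               for f, p in zip(freqs, phases))

def g1(E, lag):
    """Normalized first-order correlation |<E*(t) E(t+lag)>|."""
    a, b = E[:-lag], E[lag:]
    return abs(np.vdot(a, b)) / np.sqrt(np.vdot(a, a).real * np.vdot(b, b).real)

mono  = field([1.0])                                  # a single frequency
broad = field(1.0 + 0.05 * rng.standard_normal(200))  # ~5% bandwidth

lag = 2000  # delay of half the record length
print(g1(mono, lag))   # stays at 1: the future is fully predictable
print(g1(broad, lag))  # decays toward 0: coherence is lost
```

The single-frequency field keeps a correlation of 1 at any delay, while the multi-frequency field decorrelates once the delay exceeds roughly the inverse of the bandwidth.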

The phase of the field is *always* altering- what matters is how accurately you can predict the value.

Does this help?
 
Thanks for your reply. So you are saying that coherence is basically a measure of how far (in time or space) we can predict the magnitude of a wave, and when we can no longer accurately do this, the coherence length has been exceeded? What I still don't quite understand is what physically enables this process to occur.

Arran
 
narra said:
Thanks for your reply. So you are saying that coherence is basically a measure of how far (in time or space) we can predict the magnitude of a wave, and when we can no longer accurately do this, the coherence length has been exceeded?

Yes- you got it exactly.

narra said:
What I still don't quite understand is what physically is enabling this process to occur?

Arran

I'm not sure how to answer this- coherence is a mathematical contrivance. Personally, I would appeal to the idea of modeling an uncertain process via stochastic equations and leave it at that.
 
Thanks again Andy. Haha, my problem is that I'm not a mathematician and don't really feel comfortable with a theory until I have it explained in physical terms inside my head. I can accept a process being randomly uncertain but which physical process leads to the coherence of light becoming randomly uncertain at a prescribed time/distance? Does it relate to the random uncertainty associated with the atomic processes resulting in the emission of light or is coherence derived from wave theory? I need the physics before I can be truly comfortable with the idea :)
 
I guess it arises simply because there is more than 1 atom emitting the light.
 
Hello. In the wikipedia page of Optical Coherence Tomography it is mentioned: "Light in an OCT system is broken into two arms—a sample arm (containing the item of interest) and a reference arm (usually a mirror). The combination of reflected light from the sample arm and reference light from the reference arm gives rise to an interference pattern, but only if light from both arms have traveled the "same" optical distance ("same" meaning a difference of less than a coherence length)." Can somebody please explain me why this is the case? What would be the problem if the light was monochromatic? What would stop an interference to occur?
 
marco_polo said:
Can somebody please explain me why this is the case? What would be the problem if the light was monochromatic? What would stop an interference to occur?

Nothing would stop interference from occurring if monochromatic light was used. Interference occurs if the difference in traveled distance between the arms is shorter than the coherence length. This length will increase as your light becomes more monochromatic, even up to centimeters or meters.

However, for OCT this effect is not desirable. You want to scan tissue or the eye or something similar with high resolution. The effect you use is that the interference pattern visibility will go down as the delay approaches the coherence length. It is very easy to see the difference between 100% interference pattern visibility and 50% visibility. If you now use more monochromatic light, the visibility will reduce from 100% to, maybe, 99% for the same distance. This change is not really visible. So for resolving such short distances less monochromatic light is much better.
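The visibility argument above can be made concrete. The sketch below assumes a Gaussian power spectrum, for which the fringe visibility at delay tau is exp(-2 pi^2 (bandwidth * tau)^2) (the magnitude of the Fourier transform of the spectrum); the specific delay and bandwidth numbers are made up for illustration.

```python
import numpy as np

def visibility(delay, bandwidth):
    """Fringe visibility |g1(tau)| for a Gaussian power spectrum of rms
    width `bandwidth` (Hz) at arm delay `delay` (s)."""
    return np.exp(-2.0 * np.pi**2 * (bandwidth * delay)**2)

delay = 1e-14  # 10 fs delay, i.e. ~3 um of extra path
print(visibility(delay, 3e13))  # broadband: visibility drops well below 1
print(visibility(delay, 3e12))  # 10x narrower: visibility still near 1
```

At the same micrometer-scale path difference, the broadband source shows a clearly reduced visibility while the narrowband source barely changes, which is why the broadband source localizes the interference and gives the depth resolution.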
 
Cthugha said:
Nothing would stop interference from occurring if monochromatic light was used. Interference occurs if the difference in traveled distance between the arms is shorter than the coherence length. [...]

Thank you very much for your response. I think I'm starting to get the big picture. However, what is still unclear is the physical reason that monochromatic light can produce interference even over a path difference of meters, while near-white light can produce interference only over a very small distance. How does the range of wavelengths affect it?
 
Having less monochromatic light means that more frequencies are present. Having a long coherence time means that the phase of the light beam is predictable over a long time. The presence of many frequencies causes a slight relative phase shift to build up over time, so that the frequency components drift more and more out of phase. The relative phase of each frequency component also tends to randomize on some characteristic time scale.

In more mathematical terms, one finds that the first-order coherence function (whose decay time is the coherence time) is the Fourier transform of the power spectrum. The narrower your emission is spectrally, the longer the coherence time will be, and vice versa.
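That Fourier-transform relationship can be checked numerically. The sketch below (my own, not from the thread) builds a Gaussian power spectrum, takes its inverse FFT to get the first-order coherence function, and reads off the delay at which it decays to 1/e. The center frequency, grid size, and 1/e criterion are illustrative assumptions.

```python
import numpy as np

def coherence_time(bandwidth, f0=100.0, n=2**14, df=0.01):
    """Coherence time of a Gaussian power spectrum of rms width
    `bandwidth`, obtained as the 1/e decay delay of its Fourier
    transform (the first-order coherence function)."""
    f = np.arange(n) * df
    spectrum = np.exp(-0.5 * ((f - f0) / bandwidth) ** 2)
    g1 = np.abs(np.fft.ifft(spectrum))
    g1 /= g1[0]                      # normalize so g1(0) = 1
    tau = np.arange(n) / (n * df)    # delay axis conjugate to f
    return tau[np.argmax(g1 < 1.0 / np.e)]

print(coherence_time(1.0))  # broad spectrum  -> short coherence time
print(coherence_time(0.1))  # 10x narrower    -> ~10x longer
```

Halving the spectral width doubles the coherence time, exactly the inverse relationship described above.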
 
marco_polo said:
Thank you very much for your response. I think I'm starting to get the big picture. However, what is still unclear is the physical reason that monochromatic light can produce interference even over a path difference of meters, while near-white light can produce interference only over a very small distance. How does the range of wavelengths affect it?

The basic relation is Δt·Δf ≈ 1, where Δt is the coherence time and Δf the frequency bandwidth. In terms of distance, since x = ct (c is the speed of light), Δx·Δf ≈ c, or Δx·Δλ/λ² ≈ 1, where λ is the center wavelength and Δλ the wavelength bandwidth: with monochromatic sources, interference effects persist over long distances, while broadband sources have highly localized interference effects.
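Plugging numbers into that last relation shows the contrast directly. The sketch below evaluates the coherence length L = λ²/Δλ; the specific source values (an 840 nm, 50 nm-bandwidth OCT-style source versus a picometer-linewidth laser) are illustrative assumptions, not figures from the thread.

```python
def coherence_length(center_wavelength_nm, bandwidth_nm):
    """Coherence length in meters from Dx * Dl / l^2 = 1."""
    lam = center_wavelength_nm * 1e-9
    dlam = bandwidth_nm * 1e-9
    return lam**2 / dlam

# Broadband OCT-style source: 840 nm center, 50 nm bandwidth
print(coherence_length(840, 50))      # ~1.4e-5 m: micrometer scale
# Narrow laser line: 840 nm center, 0.001 nm bandwidth
print(coherence_length(840, 0.001))   # ~0.7 m: macroscopic
```

The broadband source interferes only over about 14 micrometers, which is what gives OCT its depth resolution, while the narrow line stays coherent over most of a meter.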
 