How does measurement time affect laser linewidth measurements?

anoegenetic
I am trying to measure the linewidth of a diode laser using the self-heterodyne technique. The laser's specs say that the linewidth is ~100 kHz at 1 μs. I guess I am confused by the "at 1 μs" and how exactly that translates to measuring the linewidth myself by looking at the beat note on a spectrum analyzer. If I want to make a measurement comparable to what is cited in the manual, what should I set my sweep time to? The bandwidth? The resolution? I guess what I don't really understand is how the linewidth changes as you change the measurement time.
 
I'll give a brief answer. It's not that strange that your measurement duration affects the linewidth if you think about the different effects that cause line broadening. On short time scales you have fast frequency/phase shifts caused e.g. by changing electric fields, on intermediate time scales acoustic noise might dominate, and on slow time scales things like temperature drifts come in. So the longer you measure for, the more sources of frequency shift come into play.
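If you want to see this numerically, here is a rough toy model (all parameters are made up for illustration, not taken from any spec sheet): white frequency noise, which produces a Lorentzian line of ~100 kHz, plus a slow sinusoidal drift standing in for acoustic/thermal noise. Keep in mind that for very short records the Fourier resolution limit ~1/T sets a floor on what you can resolve at all, while longer records pick up the drift.

```python
import numpy as np

# Toy model: white frequency noise (-> Lorentzian line, here ~100 kHz FWHM)
# plus a slow sinusoidal drift standing in for acoustic/thermal noise.
# All parameters are illustrative only.
fs = 50e6                      # sample rate, Hz
n = int(fs * 10e-3)            # 10 ms record
t = np.arange(n) / fs
rng = np.random.default_rng(0)

fwhm = 100e3                               # intrinsic Lorentzian FWHM, Hz
S_f = fwhm / np.pi                         # white freq-noise PSD, Hz^2/Hz
f_inst = rng.normal(0, np.sqrt(S_f * fs / 2), n)   # fast frequency noise
f_inst += 1e6 * np.sin(2 * np.pi * 200 * t)        # 200 Hz, 1 MHz-deep drift

field = np.exp(1j * 2 * np.pi * np.cumsum(f_inst) / fs)

def apparent_fwhm(x, fs):
    """Crude FWHM of the power spectrum of one record, in Hz."""
    spec = np.abs(np.fft.fftshift(np.fft.fft(x))) ** 2
    w = max(1, int(round(30e3 * len(x) / fs)))     # ~30 kHz smoothing
    if w > 1:
        spec = np.convolve(spec, np.ones(w) / w, mode="same")
    f = np.fft.fftshift(np.fft.fftfreq(len(x), 1 / fs))
    above = f[spec >= spec.max() / 2]
    return max(above.max() - above.min(), fs / len(x))  # at least one FFT bin

for T in (1e-6, 10e-6, 100e-6, 1e-3, 10e-3):
    m = int(fs * T)
    print(f"T = {T*1e6:7.0f} us -> apparent width ~ "
          f"{apparent_fwhm(field[:m], fs)/1e3:6.0f} kHz")
```

In this toy model the 1 μs record is resolution-limited (a single 1 μs trace simply cannot resolve 100 kHz), the intermediate records recover roughly the intrinsic 100 kHz, and the long records are visibly broadened by the drift.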

Thus if you want to measure the 100 kHz they stated, you need a total measurement time/averaging of 1 μs. Also, keep in mind that if you measure with an interferometric setup where the same laser is in both arms, as appears to be your case, then the signals in your two arms will not be uncorrelated unless the delay in one arm is much longer than the laser's coherence time, and your measurement of the linewidth will be flawed. Perhaps still good enough for what you want, but often linewidths are measured against an external reference (cavity or second laser).
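As a side note on the delay: for delayed self-heterodyne, a rule of thumb often quoted is that the delay should exceed the laser's coherence time by a factor of ~6 or so for the two arms to be effectively uncorrelated. A quick back-of-the-envelope sketch (the fiber index and the margin factor are assumptions, not values from your setup):

```python
from math import pi

c = 299_792_458.0      # speed of light, m/s
n_fiber = 1.468        # assumed group index of standard single-mode fiber
dnu = 100e3            # Hz, Lorentzian linewidth you want to resolve

tau_c = 1 / (pi * dnu)        # coherence time of a Lorentzian line
margin = 6                    # assumed rule-of-thumb decorrelation factor
tau_d = margin * tau_c        # delay the long arm should provide
length = c * tau_d / n_fiber  # fiber length giving that delay

print(f"coherence time ~ {tau_c * 1e6:.1f} us")
print(f"needed delay   ~ {tau_d * 1e6:.1f} us (fiber ~ {length / 1e3:.1f} km)")
```

For a 100 kHz line this works out to a coherence time of ~3.2 μs and a few kilometers of fiber, which is why kilometer-scale delay spools are common in these setups.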
 
Thanks for responding, Zarqon! So if I understand correctly, you're saying that measuring for longer times will typically result in a measured linewidth that is bigger than what you would measure if you averaged for a shorter time?
 