Mjolnir
So I'm in an intro quantum physics course, and while I'm sure this is a really simple problem, I'm just not getting the math to work out here.
Say you excite the atoms of some gas so that they emit light at a wavelength of 5500 angstroms as they fall back to the ground state. If the light intensity falls off as [tex]I(t) = I_0 e^{-rt}[/tex] (with r = 5*10^(-7) Hz), then the time dependence of the wave function should go as [tex]e^{-rt/2}\,e^{-i\omega_0 t}[/tex], correct?

Given this, how would you go about finding the spread of wavelengths of the spectral line? My idea was to find [tex]\langle E^2\rangle - \langle E\rangle^2[/tex] and then set the standard deviation of [tex]\lambda[/tex] equal to [tex]hc[/tex] divided by that energy spread, but as I said, the math just isn't working out.

So, as much as I hate for my first post here to be a question: could someone lay out the general process (without actually plugging in the values, of course; the idea is to understand this on my own)? It would be much appreciated.
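For what it's worth, here is how far I can get with a quick numerical sketch (my own attempt, so the approach itself is a guess): Fourier-transforming the decaying amplitude e^(-rt/2) e^(-i*omega_0*t) for t >= 0 gives a Lorentzian line shape whose full width at half maximum (FWHM) in angular frequency is r. One thing I notice is that a true Lorentzian has no finite second moment, so <E^2> literally diverges, which might be exactly why the variance route misbehaves; the FWHM seems to be the width to quote instead.

```python
import numpy as np

# Constants and the values quoted in the problem statement
c = 3.0e8                 # speed of light, m/s
lam0 = 5500e-10           # 5500 angstroms, in meters
r = 5e-7                  # decay rate from the problem, in s^-1

# Work in terms of the detuning delta = omega - omega_0 so we never
# add a tiny width to an enormous optical frequency (avoids
# floating-point cancellation).
delta = np.linspace(-20 * r, 20 * r, 200001)

# Power spectrum of e^{-rt/2} e^{-i omega_0 t} (t >= 0): the Fourier
# transform is 1 / (i*delta + r/2), so |FT|^2 is a Lorentzian with
# FWHM equal to r in angular frequency.
S = 1.0 / (delta**2 + (r / 2) ** 2)

# Numerically read off the full width at half maximum
half = S.max() / 2
above = delta[S >= half]
fwhm = above[-1] - above[0]
print("FWHM / r =", fwhm / r)   # close to 1

# Propagate the width through lambda = 2*pi*c / omega:
# |dlambda| = lam0^2 / (2*pi*c) * |domega|
dlam = lam0**2 / (2 * np.pi * c) * fwhm
print("wavelength spread (m):", dlam)
```

The last step just propagates the frequency width through [tex]\lambda = 2\pi c/\omega[/tex]; whether the FWHM or the half-width is the "right" spread to quote is exactly the kind of convention question I'm unsure about.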