JustinLevy
Let us say we have a universe permeated by a monochromatic photon gas of wavelength L0 at time t0, and the universe is expanding (for simplicity, at a constant rate H, Hubble's constant). If I sit there and measure the wavelength of the photons as a function of time, what does it look like?
From a dimensional analysis argument I'd expect something like
L = L0 (1 + H(t-t0))
Is this correct? And how would I calculate this directly?
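To make the guess concrete, here is a quick numerical sketch (my own, with an assumed value for H, not part of the original question). For a constant H the scale factor grows as a(t) = a0·exp(H(t − t0)), and wavelengths stretch in proportion to a, so the linear formula above would be the first-order approximation for small H(t − t0):

```python
import numpy as np

H = 2.3e-18   # s^-1, rough value of Hubble's constant (assumed for illustration)
L0 = 500e-9   # m, initial wavelength (arbitrary choice)

def L_linear(dt):
    """The guessed first-order formula: L = L0 * (1 + H*dt)."""
    return L0 * (1 + H * dt)

def L_exact(dt):
    """Wavelength scaling with a(t); for constant H, a(t) grows exponentially."""
    return L0 * np.exp(H * dt)

# The two agree when H*dt << 1 and diverge once H*dt is of order 1.
for dt in [1e15, 1e17, 1e18]:  # seconds
    print(f"dt={dt:.0e}s  linear={L_linear(dt):.4e}  exact={L_exact(dt):.4e}")
```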
Also, this suggests all frequencies are adjusted by just a multiplicative constant. If so, this seems to suggest a photon gas initially in a blackbody thermal distribution would not stay so (its distribution would have a different form at a later time). But that can't be right, as (for instance) the CMB is a very nice blackbody distribution. So what am I doing wrong here?
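For what it's worth, here is a sketch of my own check of the puzzle (the stretch factor s is an assumed value, not from the question): if every photon frequency is divided by the same factor s while the occupation number of each mode is carried along, the resulting occupation numbers are exactly those of a Planck distribution at temperature T/s, since the Bose–Einstein occupation depends on frequency and temperature only through the ratio ν/T:

```python
import numpy as np

h_over_k = 4.7992e-11  # s*K, Planck's constant over Boltzmann's constant

def occupation(nu, T):
    """Bose-Einstein occupation number of a photon mode: 1/(exp(h*nu/k*T) - 1)."""
    return 1.0 / np.expm1(h_over_k * nu / T)

T = 2.725    # K, CMB temperature today
s = 1100.0   # assumed redshift stretch factor, roughly 1+z of recombination
nu = np.logspace(9, 12, 5)  # Hz, sample frequencies observed today

n_then = occupation(nu * s, T * s)  # occupation at emission, when T was s times hotter
n_now = occupation(nu, T)           # same modes today, frequencies redshifted by s
print(np.allclose(n_then, n_now))   # the redshifted gas is still a blackbody, at T/s
```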