Heldo Jelbar
Hello all! I'm trying to understand the standard normalisation of the scale factor, which is set to 1 at the present time. The first Friedmann equation for a spatially flat Robertson-Walker metric with no cosmological constant is
[tex] \frac{\dot{a}^2}{a^2} = \frac{8\pi G}{3}\rho [/tex]
If we wanted to see how the density of the universe has changed from the beginning of the matter-dominated era until today, we would set
[tex] a(t) = t^{2/3} [/tex]
Differentiating gives [itex] \dot{a} = \frac{2}{3}t^{-1/3} [/itex], so that
[tex] \frac{\dot{a}^2}{a^2} = \frac{4}{9t^2} [/tex]
Inserting this back into the Friedmann equation and solving for the density, [itex] \rho = \frac{3}{8\pi G}\cdot\frac{4}{9t^2} [/itex], we get
[tex] \rho = \frac{1}{6\pi Gt^2} [/tex]
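As a sanity check on the algebra (just restating the standard matter-dilution behaviour, so please correct me if I've slipped up somewhere): since [itex] a \propto t^{2/3} [/itex], this is the same as
[tex] \rho \propto \frac{1}{t^2} \propto \frac{1}{a^3}, [/tex]
which is just the matter density diluting with the volume [itex] a^3 [/itex].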
So we see that in an expanding universe the density decreases as [itex] 1/t^2 [/itex], which is sensible. But here is my question: if we normalise the scale factor [itex] a(t) [/itex] so that [itex] a(t_0) = 1 [/itex], where [itex] t_0 [/itex] is today's time, one way of doing this is to use units in which [itex] t_0 = 1 [/itex]. This makes [itex] a(t_0) = 1 [/itex] hold straightforwardly for any power-law expansion of the scale factor. But normalising the scale factor in this way messes with the density-time relation: since all times in the past have [itex] t < 1 [/itex], the [itex] 1/t^2 [/itex] relation would then show the density INCREASING in time as the universe expands, which is no longer sensible.
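To spell out the normalisation I have in mind (in case my convention is the source of the problem), I would write the scale factor with the reference time made explicit:
[tex] a(t) = \left(\frac{t}{t_0}\right)^{2/3}, \qquad a(t_0) = 1, [/tex]
so that choosing units with [itex] t_0 = 1 [/itex] just recovers [itex] a(t) = t^{2/3} [/itex].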
So does anyone know how to correctly normalise the scale factor to avoid this issue? Any answers with their justifications would be great, and a reference to where I can read more about this would be even better! Many many thanks in advance!