johne1618
I understand that the standard cosmological model says that, as one goes back in time towards the big bang, the Universe is radiation-dominated. This means that the mass/energy density rho in the Universe is given by the Stefan-Boltzmann law:
rho ∝ T^4
where T is the temperature of the radiation in the Universe and "∝" means "proportional to".
Now the temperature T is inversely proportional to the size of the Universe R, so we have:
T ∝ R^-1
Thus
rho ∝ R^-4
Now the total energy of the Universe, E, is the energy density times the volume, which scales as R^3:
E ∝ rho R^3 ∝ R^-4 R^3
Therefore:
E ∝ R^-1
Now the entropy S of the radiation in the Universe scales as its total energy E divided by its temperature T:
S ∝ E / T
S ∝ R^-1 / R^-1 = constant
Thus, as you go back to the beginning of the Universe, standard cosmology says the entropy is constant.
But this must be wrong.
Surely we expect the entropy to go to zero as we go back to the big bang?
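For what it's worth, the scaling argument above can be checked numerically. This is only a sketch of the proportionalities, with every constant of proportionality set to 1 (so the numbers have no physical units); it just confirms that the chain T ∝ 1/R, rho ∝ T^4, E ∝ rho R^3 forces S ∝ E/T to be independent of R:

```python
def entropy(R):
    """Entropy (up to constants) of a radiation-filled universe of size R."""
    T = 1.0 / R        # temperature scales as 1/R
    rho = T ** 4       # Stefan-Boltzmann: energy density ~ T^4
    E = rho * R ** 3   # total energy = density * volume ~ R^-1
    return E / T       # radiation entropy ~ E / T

# The entropy comes out the same at every scale factor R:
for R in (0.5, 1.0, 2.0, 1000.0):
    print(R, entropy(R))
```

Every R gives the same value, which is exactly the puzzle in the question: in this naive picture the entropy does not decrease as R goes to zero.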