# Why did the universe cool down?

1. May 26, 2010

### yasar1967

In thermodynamics books I read that free expansion does NOT change the temperature of a gas, because no work is done against any medium. The initial internal energy therefore equals the final internal energy, and since the internal energy of an ideal gas is a function of temperature only, the temperature does not go down.
Yet I keep reading that after the Big Bang the universe expanded and cooled down. But why? Did the gas do work against some medium that our universe sits in? Why did the universe cool down, then? Wasn't that an adiabatic process?

2. May 26, 2010

### JDługosz

I've never seen that explained; it is just given as an analogy with common gases. The reason it happens to common gases under normal conditions is that there is an attraction between the particles, so they have more kinetic energy when closer together and more potential energy when farther apart.

They might simply be playing loose with the technical terms. For example, "energy density" clearly goes down as the volume increases while the energy stays the same, but is that a classic measure of temperature?

A general motion of the particles away from each other will cause collisions to be weaker. They smash into each other with less energy. What is that properly called?

3. May 26, 2010

### Uncle Al

Heat?

Not trying to be flip, but as energy is added to a particle it becomes more energetic and is both more likely to collide with another particle and likely to do so more energetically.

If energy-density drops, so should the quantity of energy available at any given location. Less energy means less-energetic particles, a lessened likelihood of collisions, etc.

Cooling, in other words.

Al

4. May 26, 2010

### Superstring

$$\frac{V_1}{T_1}=\frac{V_2}{T_2}$$

Temperature is proportional to volume here. Yes, that equation (Charles's law, which holds at constant pressure) is an approximation and only works for gases, but you get the point.
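For expansion without heat exchange, the more relevant relation is the adiabatic one, $TV^{\gamma-1} = \text{const}$, which makes the temperature fall as the volume grows. A rough numerical sketch (the gas type, starting temperature, and volume ratio are all illustrative assumptions):

```python
# Sketch: adiabatic cooling of an ideal gas (all values illustrative).
gamma = 5.0 / 3.0   # heat-capacity ratio for a monatomic ideal gas
T1 = 300.0          # initial temperature in kelvin (assumed)
V1, V2 = 1.0, 8.0   # the volume grows by a factor of 8 (assumed)

# Reversible adiabatic process: T * V**(gamma - 1) stays constant.
T2 = T1 * (V1 / V2) ** (gamma - 1.0)
print(f"T2 = {T2:.1f} K")  # temperature falls by a factor of 8**(2/3) = 4
```

Eightfold expansion of a monatomic gas quarters its temperature.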

5. May 27, 2010

### yasar1967

Then how come in free expansion (into a vacuum) the temperature does not change?
The energy density drops, the distance between particles increases, collisions become less likely, and yet the internal energy, which is a function of temperature, remains the same.

6. May 27, 2010

### rcgldr

You also have infrared radiation escaping beyond the farthest bits of matter in the universe, and that accounts for some of the cooling effect.

7. May 27, 2010

### m.e.t.a.

Al, the temperature of an ideal gas only drops upon expansion if the gas did work on its surroundings during that expansion (conservation of energy: $\Delta U = Q + W$). Since we assume the Universe has no surroundings on which to do work or with which to exchange heat, it follows that $Q = 0$ and $W = 0$, so $\Delta U = 0$.

yasar1967 reinforces the point:

Exactly. For ideal gases, $U \propto T$. U cannot have changed, but we observe that T has. Clearly the Universe is not an ideal gas.

I think JDługosz alludes to the answer:

In the ideal gas model, the only energy that particles possess is kinetic energy. There are no interparticle forces, and therefore there is no potential energy between particles. In the real world, however, as JDługosz points out, particles possess both kinetic and potential energy. Particles which exhibit a net attraction towards each other gain kinetic energy and lose potential energy as they "fall" towards each other, and vice versa as they drift apart.

All real-life particles exhibit a net attraction via gravity, of course. When massive bodies drift apart, they gain gravitational potential energy and lose kinetic energy. On the small, everyday scale this loss of kinetic energy due to gravity is negligibly small because we are dealing with tiny masses over tiny distances. But on the scale of the expanding Universe I would imagine that this loss of kinetic energy due to gravity could be significant. To go out on a limb: if the early Universe can be modelled as basically a colossal gas cloud (i.e. ~ lots of thermal motion; no planets/clumps of matter) then a net loss of kinetic energy in this cloud would manifest itself as a drop in temperature. As for how significant a cooling effect this would have on the Universe compared to other effects, such as rcgldr's radiation suggestion, I really have no idea!
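The gravitational trade-off described above can be checked with a toy two-body energy balance (a sketch only; the masses, distances, and starting energy below are made-up illustrative numbers):

```python
# Energy conservation for two masses drifting apart: kinetic energy is
# traded for gravitational potential energy (illustrative values only).
G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def kinetic_after_separation(ke1, m1, m2, r1, r2):
    """Kinetic energy at separation r2, given ke1 at separation r1."""
    # KE2 + U(r2) = KE1 + U(r1), with U(r) = -G * m1 * m2 / r
    return ke1 + G * m1 * m2 * (1.0 / r2 - 1.0 / r1)

# Two 10^9 kg bodies moving apart from 1000 km to 2000 km separation:
ke2 = kinetic_after_separation(ke1=1.0e3, m1=1e9, m2=1e9, r1=1e6, r2=2e6)
print(f"{ke2:.2f} J")  # slightly less than the initial 1000 J
```

As the post says, the effect is tiny for everyday masses and distances, but the sign is the point: drifting apart always costs kinetic energy.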

8. May 31, 2010

### Ich

There's the knockout argument "energy is not conserved in general relativity", but that doesn't address the question. So let's neglect gravity for the sake of clarity, and look at expansion in this toy model.
If you had an ideal gas everywhere, it wouldn't be expanding. Its density and temperature would be constant.

If you had a finite bubble of gas in otherwise empty space, it would start expanding. It's easiest if we neglect interactions and let each molecule float freely:
Only when the fastest particles from the edge of the cloud have passed the observer, and are gone forever, does the expansion become noticeable. The gas will become locally nonthermal as it separates according to particle speed (the fastest particles leaving first), and the local kinetic energy of the particles will in fact decrease. "Local kinetic energy" means the energy of some neighbouring particles as measured in the frame of their center of gravity. So, while the total energy is conserved and the total temperature is constant, at each position you measure smaller relative velocities and a smaller temperature (if applicable).
That's because the irregular motion of the initial cloud becomes partially ordered by the expansion, such that you'd locally count part of the kinetic energy as due to bulk relative motion (wrt other parts of the cloud), not as thermal energy. If you'd stop the expansion (say, by placing a box around the cloud), the gas would get mixed again, there will be no local net motion, and all the energy would be attributed to thermal motion again.

And that's quite exactly what happens on a cosmic scale. For an infinite cloud to expand, you'd have to add some underlying motion to the thermal one, such that the average velocity increases in proportion to distance. In such a gas, the locally measured temperature would decrease, because more and more of the kinetic energy would be in ordered "net motion", not in unordered thermal motion.
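This velocity-sorting picture is easy to demonstrate numerically. A minimal sketch (1-D, non-interacting particles; all parameters are invented for illustration): let a cloud of particles free-stream, and watch the velocity spread among the particles near a fixed observer shrink as the fast ones escape.

```python
import random
import statistics

random.seed(0)
N = 20000
# Initial cloud: positions uniform in [-1, 1], "thermal" Gaussian velocities.
xs = [random.uniform(-1.0, 1.0) for _ in range(N)]
vs = [random.gauss(0.0, 1.0) for _ in range(N)]

def local_velocity_spread(t, center=0.0, width=0.5):
    """Std. dev. of velocity among the particles near `center` at time t."""
    nearby = [v for x, v in zip(xs, vs) if abs(x + v * t - center) < width]
    return statistics.pstdev(nearby)

for t in (0.0, 2.0, 10.0):
    print(f"t = {t:4.1f}  local velocity spread = {local_velocity_spread(t):.3f}")
```

The total kinetic energy never changes, yet the locally measured spread (the stand-in for temperature) keeps dropping, because ever more of the motion near any given point is ordered bulk flow rather than thermal agitation.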

9. Jun 14, 2010

### zewpals

I don't have many credentials to answer this question correctly, BUT I would take a look at the formula $E = mc^2$.

Heat is a type of energy, right? When the universe was first created, it was practically a bowl of soup, meaning no mass and only energy. As time went on (we're talking about split seconds here haha), mass came out of this energy to form elementary particles such as hadrons and leptons, then protons, and much later, atoms.

Simply put, heat was converted to nuclear potential energy in the formation of atoms (and possibly other types of energy in the formation of elementary particles). Much, much later, heat could be used to form chemical potential energy in molecules and more nuclear potential energy in larger atoms.

NOW...atoms are only supposed to make up 4.6% of the mass-energy of the universe (dark energy is 72% and dark matter is 23%...where the hell did normal energy go haha?). This seems to be much too small a number for all that energy to be stored in. Perhaps the answer lies in others' answers, or in dark matter and dark energy. They too could have taken a part in the cooling of the universe.

Again I'm not qualified to answer something like this; it is just a hypothesis someone with 1 college physics course and some astronomy so far has to offer haha.

10. Jun 14, 2010

### Dickfore

First of all, free expansion does not change the temperature of an ideal gas.

Second, the Universe does not undergo a free expansion. The 'expansion' that the universe undergoes is more like a dilation of every distance (like the surface of a balloon when it gets inflated, if we picture the Universe as a 2-dimensional manifold), and it does not expand into some 'empty space', because the Universe is the whole of space-time.

If you look at some simple thermodynamics of the Universe as a whole, you will find that the expansion looks more like an adiabatic expansion.
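That adiabatic view can be sketched with a toy scaling law (hedged: this treats the cosmic fluid as a simple ideal gas with $V \propto a^3$, where $a$ is the scale factor, and the normalisation is arbitrary):

```python
# Adiabatic expansion with V proportional to a**3:
# T * V**(gamma - 1) = const  =>  T proportional to a**(-3 * (gamma - 1)).
def temperature(a, T0=1.0, gamma=5.0 / 3.0):
    """Temperature at scale factor a, normalized so T = T0 at a = 1."""
    return T0 * a ** (-3.0 * (gamma - 1.0))

# Monatomic matter (gamma = 5/3): T falls as 1/a^2 when distances double.
print(temperature(2.0))                   # ~0.25
# Photon gas (gamma = 4/3): T falls as 1/a, matching the CMB's redshift.
print(temperature(2.0, gamma=4.0 / 3.0))  # ~0.5
```

The same formula thus reproduces the standard cosmological results that matter temperature scales as $1/a^2$ while radiation temperature scales as $1/a$.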

11. Jun 15, 2010

### Ich

That's what I meant when I said "knockout argument".
It's common to claim that universal expansion is something fundamentally different from everything we know, and that we cannot understand this or that feature with our classical intuition.

I find it more worthwhile to see how we can understand the cooling nonetheless. So here's the short version of my previous post:
The free expansion of a blob of a hot ideal gas looks everywhere locally like an adiabatic expansion.

12. Jun 16, 2010

### Chronos

A finite amount of mass expanding into unbounded space cools. 'Heat' is a kinematic property of objects in motion. Increasing the distance between objects means fewer collisions, and fewer collisions mean less heat.

13. Jun 16, 2010

### Ich

Heat (temperature) is associated with the average kinetic energy of a particle rather than its collision frequency. As energy is conserved, so is the average energy per particle, and with it the "temperature".
What happens is that the particles automatically sort themselves by velocity, so that everywhere the velocity scatter (which is what is proportional to the actual local temperature) decreases.

14. Jun 16, 2010

### Dickfore

No, it does not. During free expansion, the gas does not exchange heat AND does no work on the environment. According to the First Law of Thermodynamics, then, its internal energy does not change. As a special case, since the internal energy of an ideal gas is a function of temperature only, its temperature won't change either. However, this is not true for real gases. Although the total internal energy remains unchanged, the average distance between the molecules in the expanded gas is larger than in the compressed gas, and therefore the average potential energy between the molecules increases. This is only possible if the average kinetic energy decreases. Since temperature is a measure of the average kinetic energy of the gas, its temperature would actually decrease during this free expansion.

On the other hand, during adiabatic expansion, the gas merely does not exchange heat with the environment. According to the First Law of Thermodynamics, the work done by the gas on the environment is equal to the negative change of its internal energy. Since this work must be positive (the gas expands), the internal energy of the gas decreases! Even for an ideal gas, then, the average kinetic energy will decrease (since the total internal energy of an ideal gas equals its total kinetic energy in the center-of-mass frame) and, thus, its temperature drops.
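The real-gas point can be made quantitative with the van der Waals model. A sketch (the constants below are roughly nitrogen-like, but every number is an illustrative assumption):

```python
# Joule (free) expansion of a van der Waals gas: delta U = 0, but the
# attraction term -a*n**2/V converts kinetic energy into potential energy.
# vdW internal energy: U = n * Cv_m * T - a_vdw * n**2 / V
def joule_expansion_dT(n, V1, V2, a_vdw, Cv_m):
    """Temperature change as n moles freely expand from V1 to V2 (SI units)."""
    return (a_vdw * n / Cv_m) * (1.0 / V2 - 1.0 / V1)

n = 1.0        # moles
a_vdw = 0.137  # Pa m^6 / mol^2, roughly the van der Waals 'a' of nitrogen
Cv_m = 12.47   # J / (mol K), 3R/2 used for simplicity
dT = joule_expansion_dT(n, V1=1e-3, V2=2e-3, a_vdw=a_vdw, Cv_m=Cv_m)
print(f"dT = {dT:.2f} K")  # negative: the real gas cools a few kelvin
```

Doubling the volume from one litre drops the temperature by a few kelvin, while an ideal gas ($a_{\text{vdw}} = 0$) would not cool at all.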

15. Jun 17, 2010

### Phrak

I notice that no one has correctly answered the OP. Odd...

16. Jun 17, 2010

### Dickfore

This would imply you do know the correct answer to the OP's question.

17. Jun 17, 2010

### harcel

Expansion into nothing DOES decrease the temperature, because the expansion of the universe is slow and therefore adiabatic (the ideal gas law applies at any given moment, contrary to free expansion, so work is done by the expanding gas). I suggest you read books on cosmology rather than just thermodynamics.

18. Jun 17, 2010

### Phrak

We know that electromagnetic radiation is red-shifted in an expanding universe, so we know already that the energy and momentum of a photon are reduced between the time of emission and the time of absorption. The same is true of the energy and momentum of a massive particle: it is also red-shifted. It will, on average, exchange less momentum in each subsequent collision, as the shape of the space changes while it is in transit.
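This redshift argument amounts to the statement that any freely moving particle's momentum scales as $p \propto 1/a$. A minimal sketch of the consequences (normalisations are arbitrary assumptions):

```python
# Momentum redshift in an expanding universe: p scales as 1/a for photons
# and for freely moving massive particles alike.
def redshifted_momentum(p0, a0, a):
    """Momentum at scale factor a, given momentum p0 at scale factor a0."""
    return p0 * a0 / a

p = redshifted_momentum(p0=1.0, a0=1.0, a=2.0)
# Photon: E = p*c, so radiation energy (and temperature) falls as 1/a.
print(p)       # 0.5
# Massive particle: KE = p**2 / (2m), so matter temperature falls as 1/a^2.
print(p ** 2)  # 0.25
```

The kinematic picture therefore agrees with the thermodynamic one: radiation cools as $1/a$ and nonrelativistic matter as $1/a^2$.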

19. Jun 21, 2010

### Ich

It appears to me that you didn't read what I've written.
You're mistaken.
This is supposed to be an answer? Looks like mysticism.