Second law of thermodynamics: dS = δq/T

Getting what I think straight: it was extremely hot at the Big Bang, and all matter existed within the "sphere", or whatever you wish to call the point. There was nothing outside this, so the average temperature was incredibly high. As the universe expanded, the mass spread out, forming quarks, then protons, then atoms, and so on. The kinetic energy spread out as well. I assume the energy lost in the cooling process was transferred into the energy used in expanding, which is why the universe is still cooling: because it is still expanding.

Basically, I'm asking why entropy is increasing if the temperature of the universe is decreasing. Is entropy a function of energy rather than temperature, then? And if so, why is it not constant?
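One way to see that entropy and temperature can move in opposite directions is with a toy model: for a monatomic ideal gas, the Sackur-Tetrode result gives ΔS/(Nk) = ln(V₂/V₁) + (3/2) ln(T₂/T₁), so the volume term can outweigh the cooling term. This is only an illustrative sketch (the real universe is not an ideal gas in a box, and the helper name `delta_S_per_Nk` is made up for this example):

```python
import math

def delta_S_per_Nk(V1, T1, V2, T2):
    """Entropy change per particle, in units of Boltzmann's constant k,
    for a monatomic ideal gas taken from state (V1, T1) to (V2, T2).
    From Sackur-Tetrode: dS/(N k) = ln(V2/V1) + (3/2) ln(T2/T1)."""
    return math.log(V2 / V1) + 1.5 * math.log(T2 / T1)

# Expand the volume tenfold while the temperature halves:
dS = delta_S_per_Nk(1.0, 300.0, 10.0, 150.0)
print(dS)        # positive: entropy rises even though the gas cooled

# A reversible adiabatic expansion (T V^(2/3) = const) gives zero change:
print(delta_S_per_Nk(1.0, 300.0, 2.0, 300.0 * 2 ** (-2 / 3)))
```

So in this toy model, expansion with cooling still raises the entropy as long as the gas cools more slowly than a reversible adiabatic expansion would demand, which is one way of phrasing the question being asked.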