Some salient characteristics of "entropy"
litewave said:
Can't the phenomenon of increasing entropy be explained as a result of the fact that in a collision of two particles the higher-energy particle always passes energy to the lower-energy particle (and never vice versa)? Hence energy becomes more evenly distributed in space...
Note that entropy is the central technical term in both
statistical thermodynamics, a physical theory, and
information theory, a (highly applicable!) mathematical theory allied to ergodic theory.
It is essential to recognize that in both areas many entropies are mathematically defined, and these definitions have a rather complicated mathematical relationship to one another. Thus it is never proper to refer to "entropy" without qualification. Unfortunately, this bad habit is standard in the research literature, and it is the cause of perennial and entirely unnecessary confusion.
More generally, please note that IMO the intersection between information theory and statistical thermodynamics is probably the most difficult thing in modern physics, involving as it does deep philosophical, technical, and interpretive problems. So don't underestimate the importance of being very, very careful in how you think and write.
Fortunately, despite some sloppiness of terminology, you appear to have hit upon some correct intuition. First, physical systems which can do useful work must have some (possibly abstract) "inhomogeneity" which can be "leveraged" (possibly in some abstract sense) to run some kind of (possibly idealized) "engine". Second, entropy does increase when one "homogenizes" a system (possibly in some abstract but deterministic sense, or possibly in a statistical sense). If you adopt a mathematical notion of entropy closely allied to the notions most often encountered, you can prove a lemma to this effect (but only valid for this particular notion of entropy).
One of the simplest mathematical definitions of "entropy" occurs in a very simple setting: partitions of a finite set (no probability measure needed, or if you prefer, we impose "counting measure"). Boltzmann suggested defining the entropy of such a partition \pi as the log of the multinomial coefficient
S(\pi) = \log \binom{n}{n_1, \, n_2, \, \dots, \, n_r} = \log \, \frac{n!}{n_1! \, n_2! \dots n_{r-1}! \, n_r!}
where the n_j are the sizes of the r blocks of the partition, so that n = n_1 + n_2 + \dots + n_r. For convenience we can allow zero values for the n_j, as long as this sum condition holds (and n > 0).
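To make the definition concrete, here is a minimal computational sketch (my own, not part of the original definition), using natural logarithms; the helper name boltzmann_entropy is simply an illustrative choice:

from math import lgamma

def boltzmann_entropy(block_sizes):
    """Natural log of the multinomial coefficient n!/(n_1! n_2! ... n_r!),
    where n is the sum of the block sizes."""
    n = sum(block_sizes)
    # lgamma(k + 1) == log(k!), which avoids overflow for large blocks
    return lgamma(n + 1) - sum(lgamma(k + 1) for k in block_sizes)

# A "homogenized" partition has higher entropy than a lopsided one:
print(boltzmann_entropy([5, 5]))   # log 252 ~ 5.53
print(boltzmann_entropy([9, 1]))   # log 10  ~ 2.30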
Exercise: Assume that not all blocks have the same size. Suppose, with little loss of generality, that n_1 > n_2 + 1 > 0. What happens to the Boltzmann entropy if n_1 \rightarrow n_1 - 1, \; n_2 \rightarrow n_2 + 1?
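Hint (a quick sketch of one way to see it): the two multinomial coefficients differ only in the factors n_1! and n_2!, so the change in entropy is
\Delta S = \log \, \frac{n_1! \, n_2!}{(n_1 - 1)! \, (n_2 + 1)!} = \log \, \frac{n_1}{n_2 + 1}
which is positive precisely when n_1 > n_2 + 1: shifting one element from a larger block to a smaller one increases the Boltzmann entropy.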
Exercise: Read Weeks 247-250 and 252 (the parts dealing with Kleinian geometry) of This Week's Finds:
http://www.math.ucr.edu/home/baez/TWF.html
Recalling that a Kleinian geometry arises whenever we have a group acting on a set, can you recognize a Kleinian geometry lurking behind Boltzmann's entropy?
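Hint (a sketch, treating the blocks as labeled): the symmetric group S_n acts on the underlying n-element set, and the stabilizer of a partition with labeled blocks of sizes n_1, \dots, n_r is S_{n_1} \times \dots \times S_{n_r}. By the orbit-stabilizer theorem, Boltzmann's entropy is then the log of the size of the orbit of \pi:
S(\pi) = \log \, \frac{|S_n|}{|S_{n_1} \times \dots \times S_{n_r}|} = \log \, \frac{n!}{n_1! \dots n_r!}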
Exercise: Try to discover and prove further formal properties of Boltzmann entropy. Can you "categorify" these to find analogous formal properties at the level of Kleinian geometry? Are those more general?
Exercise: If you are familiar with another mathematical definition of "entropy", ditto.
Exercise: If you are not familiar with any other mathematical definitions of "entropy", apply Stirling's approximation to the Boltzmann entropy and try to find some formal properties of the quantity you come up with. What is the natural mathematical setting for its definition?
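Hint (a rough sketch using the crude form \log n! \approx n \log n - n): writing p_j = n_j/n, one finds
S(\pi) \approx n \log n - \sum_{j=1}^r n_j \log n_j = -n \sum_{j=1}^r p_j \log p_j
that is, n times the Shannon entropy of the probability distribution (p_1, \dots, p_r), which suggests that the natural setting for this quantity is probability distributions on finite sets.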
If any of this tickles your fancy, I recommend that you study one of the greatest (and most readable and most fun!) scientific papers of all time:
http://www.math.uni-hamburg.de/home/gunesch/Entropy/shannon.ps
Then read Cover and Thomas,
Elements of Information Theory, Wiley, 1991. (By far the best introduction, and the only one which adequately conveys some sense of the vast scope of this theory, which ranks with calculus, probability, and linear algebra as one of the most applicable of all mathematical theories.) See
http://www.math.uni-hamburg.de/home/gunesch/entropy.html for more fun on-line stuff.
A good book for anyone utterly baffled by the above is Lawrence Sklar,
Space, Time, and Spacetime.
(Urgent warning! Articles on information theory and physics in Wikipedia have been the subject of a bitter edit war between a lone dissident with very odd and inchoate but very determined views, and everyone else. No, don't ask, but I would urge anyone to strictly avoid reading Wikipedia articles on this topic until you already know a great deal, since a naive student could easily be misled and eventually have to try to "unlearn" vast amounts of misinformation. Much, much better to stick with reliable sources while you are still a student!)