Why is the phenomenon of increasing entropy not as straightforward as it seems?

In summary, entropy is a measure of a system's disorder, and total entropy increases when energy flows spontaneously from a more energetic region to a less energetic one. This intuition is largely correct, and for suitable mathematical definitions of entropy it is supported by a provable lemma.
  • #1
litewave
Can't the phenomenon of increasing entropy be explained as a result of the fact that in a collision of two particles the higher-energy particle always passes energy to the lower-energy particle (and never vice versa)? Hence energy becomes more evenly distributed in space...
 
  • #2
Hmm. Can entropy be defined purely as the way energy is distributed in space?
 
  • #3
Yes, that's how I understand it. Via particle collisions, energy flows spontaneously from a hot (i.e. highly energetic) region to a cold (less energetic) region and as a result energy becomes evenly distributed across the two regions and the temperature of the two regions becomes equal.

Say the total energy passed from the hot region to the cold region is Q, the initial temperature of the hot region is T1, and the initial temperature of the cold region is T2. During the transfer the hot region's entropy decreases by roughly Q/T1 (approximately, since its temperature changes as energy leaves it) and the cold region's entropy increases by roughly Q/T2. Since T1 > T2, the total entropy of the system increases.
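
To make this concrete with round numbers made up for illustration: if Q = 1200 J flows from a region at T1 = 400 K to a region at T2 = 300 K, then
[tex]
\Delta S_{\mathrm{total}} = \frac{Q}{T_2} - \frac{Q}{T_1} = \frac{1200\ \mathrm{J}}{300\ \mathrm{K}} - \frac{1200\ \mathrm{J}}{400\ \mathrm{K}} = 4\ \mathrm{J/K} - 3\ \mathrm{J/K} = +1\ \mathrm{J/K} > 0 .
[/tex]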
 
  • #4
Though I am not sure, I don't think that definition is right, because it wouldn't support the "cold death" hypothesis for the universe.
 
  • #5
Some salient characteristics of "entropy"

litewave said:
Can't the phenomenon of increasing entropy be explained as a result of the fact that in a collision of two particles the higher-energy particle always passes energy to the lower-energy particle (and never vice versa)? Hence energy becomes more evenly distributed in space...

Note that entropy is the central technical term in both statistical thermodynamics, a physical theory, and information theory, a (highly applicable!) mathematical theory allied to ergodic theory. It is essential to recognize that in both areas, many entropies are mathematically defined, and these definitions have a rather complicated mathematical relationship to one another. Thus, it is never proper to refer to "entropy" without qualification. Unfortunately, this bad habit is standard in the research literature, which is the cause of perennial and entirely unnecessary confusion :grumpy:

More generally, please note that IMO the intersection between information theory and statistical thermodynamics is probably the most difficult thing in modern physics, involving as it does deep philosophical, technical, and interpretive problems. So don't underestimate the importance of being very, very careful in how you think and write.

Fortunately, despite some sloppiness of terminology, you appear to have hit upon some correct intuition. First, physical systems which can do useful work must have some (possibly abstract) "inhomogeneity" which can be "leveraged" (possibly in some abstract sense) to run some kind of (possibly idealized) "engine". Second, entropy does increase when one "homogenizes" a system (possibly in some abstract but deterministic sense, or possibly in a statistical sense). If you adopt a mathematical notion of entropy closely allied to the notions most often encountered, you can prove a lemma to this effect (but only valid for this particular notion of entropy).

One of the simplest mathematical definitions of "entropy" occurs in a very simple setting: partitions of a finite set (no probability measure needed, or if you prefer, we impose "counting measure"). Boltzmann suggested defining the entropy of such a partition [itex]\pi[/itex] as the log of the multinomial coefficient
[tex] S(\pi) =
\log \left(
\begin{array}{ccccc} & & n & & \\
n_1 & n_2 & \dots & n_{r-1} & n_r
\end{array} \right)
= \log \, \frac{n!}{n_1! \, n_2! \dots n_{r-1}! \, n_r!}
[/tex]
where the [itex]n_j[/itex] are the sizes of the r blocks of the partition, so that [itex]n = n_1 + n_2 + \dots + n_r[/itex]. For convenience we can allow zero values for the [itex]n_j[/itex], as long as this sum condition holds (and [itex]n>0[/itex]).
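
As a quick numerical check (a minimal Python sketch; the partition sizes here are made up for illustration), one can compute this quantity directly and see that evening out the block sizes raises it:

[code]
from math import lgamma

def boltzmann_entropy(blocks):
    """Log of the multinomial coefficient n! / (n_1! n_2! ... n_r!),
    computed with log-gamma to avoid forming huge factorials."""
    n = sum(blocks)
    return lgamma(n + 1) - sum(lgamma(k + 1) for k in blocks)

# A more even partition of the same n has the larger Boltzmann entropy:
print(boltzmann_entropy([9, 1]))   # log 10  = about 2.30
print(boltzmann_entropy([5, 5]))   # log 252 = about 5.53
[/code]

The comparison is exactly the move in the first exercise below: shifting one element from a larger block to a smaller one increases [itex]S(\pi)[/itex].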

Exercise: Assume that not all blocks have the same size. Suppose with little loss of generality that [itex]n_1 > n_2 +1 > 0[/itex]. What happens to the Boltzmann entropy if [itex]n_1 \rightarrow n_1-1, \; n_2 \rightarrow n_2+1[/itex]?

Exercise: read Weeks 247-250 and 252 (the parts dealing with Kleinian geometry) of This Week's Finds http://www.math.ucr.edu/home/baez/TWF.html
Recalling that a Kleinian geometry arises whenever we have a group acting on a set, can you recognize a Kleinian geometry lurking behind Boltzmann's entropy?

Exercise: Try to discover and prove further formal properties of Boltzmann entropy. Can you "categorify" these to find analogous formal properties at the level of Kleinian geometry? Are these more general?

Exercise: If you are familiar with another mathematical definition of "entropy", ditto.

Exercise: If you are not familiar with any other mathematical definitions of "entropy", apply Stirling's approximation to the Boltzmann entropy and try to find some formal properties of the quantity you come up with. What is the natural mathematical setting for its definition?
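
Hint (this gives away the first step of the last exercise, so skip it if you want to work it out yourself): keeping only the leading terms of Stirling's approximation, [itex]\log n! \approx n \log n - n[/itex], gives
[tex]
S(\pi) \;\approx\; n \log n - \sum_{j=1}^r n_j \log n_j \;=\; -\,n \sum_{j=1}^r p_j \log p_j, \qquad p_j = \frac{n_j}{n},
[/tex]
i.e. n times the Shannon entropy of the relative block sizes.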

If any of this tickles your fancy, I recommend that you study one of the greatest (and most readable and most fun!) scientific papers of all time:
http://www.math.uni-hamburg.de/home/gunesch/Entropy/shannon.ps

Then read Cover and Thomas, Elements of Information Theory, Wiley, 1991. (By far the best introduction, the only one which adequately conveys some sense of the vast scope of this theory, which ranks with calculus, probability, and linear algebra as one of the most applicable of all mathematical theories.) See http://www.math.uni-hamburg.de/home/gunesch/entropy.html for more fun on-line stuff.

A good book for anyone utterly baffled by the above is Lawrence Sklar, Space, Time, and Spacetime.

(Urgent warning!: articles on information theory and physics in the Wikipedia have been the subject of a bitter edit war between a lone dissident with very odd and inchoate but very determined views, and everyone else. No, don't ask, but I would urge anyone to strictly avoid reading Wikipedia articles on this topic until you know a great deal already, since a naive student could easily be misled and eventually have to try to "unlearn" vast amounts of misinformation. Much much better to stick with reliable sources while you are still a student!)
 
  • #6
Chris Hillman said:
The idea of "heat death" is that the universe has been homogenized, so that no free energy gradient is possible. "Cold death" is a somewhat silly term referring to a future epoch when free energy gradients have become too diffuse to support life as we know it, or possibly (eventually) too diffuse to run any idealized information processing device. So from our perspective, they are pretty similar.

hmm so entropy obeys conservation of energy?
 
  • #7
Chris Hillman said:
Of course not. That doesn't even make sense and I can't imagine why you thought anything I wrote would imply this.

that's exactly what your post implies, that energy is conserved but becomes impotent.

edit

maybe my understanding of energy is elementary; what you mean is that, by virtue of its inability to do work, it is no longer usable energy.
 
  • #8
Chris Hillman said:
Maybe you simply need to express yourself more clearly. As I already noted, this is a subtle and extremely complex subject, so clarity of expression is more important, not less so.
does the total amount of energy in the closed system of the universe remain constant?
 
  • #9
Chris Hillman said:
It turns out that this is not an easy question to discuss, and if you don't know this, it is safe to assume that you lack the background required to try to explain why not.

Did you search for past threads on this? If you don't find anything satisfactory, why not ask about that in a new thread in the Cosmology subforum? Due to the complexity of the subject, I'd prefer that we try to keep this thread more or less on topic. TIA

no offense but telling me to rifle through other threads of abstruse information doesn't lend to my understanding. a glib answer would have sufficed.
 
  • #10
Chris Hillman said:
Gee whiz, ice, aren't you being a little harsh? I dislike glib answers and unless I expended more time and energy than I currently have available, it's not clear that my glib answer would be better than the one offered in the WP article I cited. And please, no offense backatcha, but shouldn't you be willing to make an effort yourself?

i'm faced with a problem endemic to people like me. i yearn to know advanced topics but to know these advanced topics i need the foundation. I'm working on a foundation, I'm a beginning physics major. for now though if i am curious about something it is pointless to try to read scholarly material because it is very opaque to me. so if i need a curiosity satisfied the best that i can hope for is a simplified answer.

though i can understand if and why you would be unwilling to provide one, it is still very stultifying
 
  • #12
ice109 said:
no offense but telling me to rifle through other threads of abstruse information doesn't lend to my understanding. a glib answer would have sufficed.

Searching for an answer would have helped you out more. That is all he is trying to hint at. If you are not willing to at least look at some threads with similar issues discussed it must not be all that important.
 
  • #13
Thanks Chris, I thought that the phenomenon of increasing entropy, or irreversible spontaneous dispersal of energy, could be explained simply from the fact that a more energetic particle always passes energy to a less energetic one, without any statistical ambiguity. Seems the matter is more complicated though
 
  • #14
Entropy comments

litewave said:
Can't the phenomenon of increasing entropy be explained as a result of the fact that in a collision of two particles the higher-energy particle always passes energy to the lower-energy particle (and never vice versa)? Hence energy becomes more evenly distributed in space...


If energy were always transferred from the high-energy particles to the lower-energy ones, then all the chemical reactions in the universe would have gone to completion, but it does not work this way. A reaction comes to a halt when the entropy generated is at a maximum, i.e. the products formed and the reactants left correspond to the highest disorder. This is where the reaction is at equilibrium.

According to what I've read, the entropy change of a system can be negative, which means that energy can be transferred from a lower-energy reservoir to a higher one, but this requires work input. The total entropy generated, however, can't be negative in real life, due to the irreversibilities present, like friction.

Consider an isolated (adiabatic) universe containing subsystems. Even though we cannot transfer heat into the system from outside, the entropy of this universe (the system) is still generated due to the irreversibilities present in the processes going on within it.

Final comment: the energy of the particles in the reaction is conserved, but entropy is generated during the reaction; at the equilibrium point this causes the reaction to stop proceeding further (or going back) without all the high-energy molecules having reacted.

I hope this helps.
 
  • #15
litewave said:
Thanks Chris, I thought that the phenomenon of increasing entropy, or irreversible spontaneous dispersal of energy, could be explained simply from the fact that a more energetic particle always passes
energy to a less energetic one, without any statistical ambiguity.
Seems the matter is more complicated though

Yes, as sd_barry said, that's not really true. BTW, maximizing entropy is Legendre-dual to minimizing free energy. In ancient days I used to have a post archived at my website which explained Legendre duality in two ways, one geometric and the other analytic.
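
A sketch of one standard way to state that duality (this is just the textbook statement, not the archived post): at fixed temperature T, minimizing the free energy over probability distributions p is the same variational problem as maximizing the entropy S(p) at fixed average energy, with T entering as the Lagrange multiplier,
[tex]
F(T) \;=\; \min_{p} \Big( \langle U \rangle_p \;-\; T\, S(p) \Big) .
[/tex]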

Another comment on what sd_barry said: entropy in statistical mechanics is properly understood against the background of ergodic theory. The book by Lawrence Sklar, Space, Time, and Spacetime, has some excellent nontechnical discussion of the Poincaré recurrence theorem, which bears directly upon the Second Law.
 

1. Why is entropy always increasing?

The second law of thermodynamics states that the total entropy of a closed system will not decrease over time. The statistical reason behind it is that natural processes tend to move towards states of maximum entropy, in which energy is evenly dispersed and no work can be done: the random, chaotic motion of the particles in a system makes the more disordered macrostates, which can be realized in vastly more ways, overwhelmingly more probable.

2. How does the increase in entropy affect our daily lives?

The increase in entropy has a significant impact on our daily lives. It is the driving force behind many natural processes, such as the flow of heat, the diffusion of molecules, and the decay of organic matter. It also explains why it is difficult to maintain order and organization in our surroundings, and why energy is required to do so. The increase in entropy also has implications for the future of our universe, as it suggests a gradual decline towards disorder and equilibrium.

3. Can entropy ever decrease?

While it is possible for the entropy of a small, local subsystem to decrease, the total entropy of a closed system will not decrease. This is the second law of thermodynamics, which states that the total entropy of a closed system will never decrease over time. In other words, while entropy can be temporarily reduced in one area, it will increase at least as much in another area, so real (irreversible) processes always produce a net increase in entropy.
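
As a worked illustration (numbers invented for the example): a refrigerator extracting Q = 300 J from a cold interior at T_c = 250 K while consuming W = 100 J of work dumps Q + W = 400 J into a room at T_h = 300 K, so
[tex]
\Delta S_{\mathrm{total}} = -\frac{300\ \mathrm{J}}{250\ \mathrm{K}} + \frac{400\ \mathrm{J}}{300\ \mathrm{K}} \approx -1.20\ \mathrm{J/K} + 1.33\ \mathrm{J/K} \approx +0.13\ \mathrm{J/K} > 0 ;
[/tex]
the interior's entropy drops, but the total entropy still rises.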

4. How does entropy relate to information and knowledge?

Entropy and information are closely related concepts. In thermodynamics, entropy is a measure of the amount of disorder and randomness in a system. In information theory, entropy is a measure of the uncertainty or randomness in a message or data source: the higher the entropy, the more information is gained, on average, by observing the outcome. This relationship is important in fields such as computer science and communication; for example, the entropy of a data source sets a lower bound on how far it can be losslessly compressed.
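
A minimal sketch of the information-theoretic notion in Python (the probabilities are made up for illustration):

[code]
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum_i p_i log2 p_i (terms with p_i = 0 are dropped)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain, so each toss conveys a full bit;
# a heavily biased coin conveys much less information per toss on average.
print(shannon_entropy([0.5, 0.5]))    # 1.0 bit
print(shannon_entropy([0.99, 0.01]))  # about 0.08 bit
[/code]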

5. Is there a maximum limit to entropy?

The third law of thermodynamics actually sets a minimum, not a maximum: the entropy of a perfect crystal approaches zero as the temperature approaches absolute zero, where the particles are in their lowest possible energy state and there is no disorder or randomness (a state that is theoretically impossible to reach). There is no universal maximum value of entropy; for a given isolated system, entropy increases until equilibrium is reached, which on the largest scales suggests a far future in which the energy of the universe is evenly dispersed.
