Is there a deeper connection between thermal and information entropy?

  • Context: Graduate 
  • Thread starter: revo74
  • Tags: Entropy, Thermodynamics
SUMMARY

The discussion clarifies the distinction between thermodynamic entropy and information entropy, emphasizing that entropy is not synonymous with disorder. The formula S = −Σ P(i) log P(i) illustrates entropy's probabilistic nature, a view given statistical form by Boltzmann and later adapted by Shannon. The conversation highlights that while Boltzmann associated entropy with disorder, Shannon's perspective relates it to information, establishing that higher entropy corresponds to greater uncertainty about a system's microstate. The debate ultimately shows that the two terms are semantically different yet conceptually linked, with implications for understanding thermodynamic processes and information theory.

PREREQUISITES
  • Understanding of thermodynamic principles, specifically entropy.
  • Familiarity with Boltzmann's entropy formula and its implications.
  • Knowledge of information theory and Shannon's contributions.
  • Concepts of macrostate and microstate in statistical mechanics.
NEXT STEPS
  • Research the implications of Boltzmann's constant in thermodynamic entropy.
  • Explore the relationship between entropy and the second law of thermodynamics.
  • Study the differences between classical thermodynamics and statistical mechanics.
  • Investigate the applications of information entropy in data science and communication theory.
USEFUL FOR

Students and professionals in physics, particularly those focused on thermodynamics and statistical mechanics, as well as data scientists and information theorists seeking to understand the interplay between entropy concepts.

  • #61
Studiot said:
Such a statement by itself is not proof of linkage, any more than the following proves a linkage between a certain stone in my garden and the USS Forrestal.

The weight of the USS Forrestal is exactly 1.2345658763209 × 10^9 times the weight of a certain stone in my garden.

By that logic, E = mc^2 by itself would be no more proof of a linkage between energy and mass than your USS Forrestal example. You are interpreting it the wrong way: E = mc^2 is a statement that there is a linkage, not a proof that there is one. Likewise, S = kH is a statement that there is a linkage between thermodynamic entropy S and information entropy H, not a proof.
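To make the claimed S = kH linkage concrete, here is a minimal numeric sketch. The microstate probabilities are made up for illustration, and Shannon entropy is taken in nats so that the Boltzmann constant is the only conversion factor; none of this is from the thread itself.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

# Hypothetical microstate probabilities for a tiny system (illustrative only)
p = [0.5, 0.25, 0.125, 0.125]

# Shannon entropy in nats: H = -sum_i p_i ln p_i
H = -sum(pi * math.log(pi) for pi in p)

# The claimed linkage: thermodynamic entropy S = k * H
S = k_B * H
print(f"H = {H:.4f} nats, S = {S:.3e} J/K")
```

The formula asserts the mapping between the two quantities; whether that mapping is physically meaningful is the separate question being debated here.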
 
  • #62
The following is an interpretation of entropy inspired by the video below. I develop the content further and attempt to reconcile the seeming unrelatedness of thermo and info entropy. The result is rather successful. Please tell me whether it works in explaining the equations.
http://www.youtube.com/watch?v=xJf6pHqLzs0

Let's begin with a ball falling vertically toward the ground at a certain speed.

Initially the velocity vector of each particle in the ball is almost the same in both magnitude and direction. Each velocity vector has a component that is random in direction because the particles are vibrating, but the magnitude of this random component is small and insignificant compared to the vertical velocity vector, which is the same for all particles.

When the ball collides with the ground, the ball particles at the contact surface receive some momentum from particles at the ground surface. As the ground particles are vibrating, they too carry a velocity vector that is random in direction (statistically, NOT individually). That means the momentum vectors (one arrow per particle) transferred to the ball particles are also random in direction.

So the colliding ball particles might receive momentum in various combinations, ranging from all in the same direction like up, up, up, up, up..., through a less ordered combination like up, up, up, up, left..., to totally random directions like left, up, left, right, up... or left, up, down, up, left...
Since there are far more disordered combinations than ordered ones, almost 100% of the time the ball will receive momentum kicks in disordered directions, as the toy count below illustrates.
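A brute-force count makes this concrete. The setup is my own illustration, not from the original post: each of n = 5 ball particles receives a kick in one of four equally likely directions, and "ordered" means every kick points the same way.

```python
from itertools import product

directions = ["up", "down", "left", "right"]
n = 5  # number of ball particles receiving a kick (toy number)

total = ordered = 0
for combo in product(directions, repeat=n):
    total += 1
    if len(set(combo)) == 1:  # every kick in the same direction
        ordered += 1

print(f"{ordered} ordered out of {total} combinations "
      f"({ordered / total:.4%})")  # 4 of 1024, about 0.39%
```

With realistic particle numbers (on the order of 10^23 rather than 5), the ordered fraction becomes so small that disordered kicks are effectively certain.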

Having received these random momenta, the magnitude of the random part of each particle's velocity vector becomes more significant compared to its initial velocity vector, and entropy is said to have increased. Thermodynamically speaking, entropy is the significance of the random part of the particle velocity vector compared to the whole velocity vector.

Let's verify this interpretation with the thermodynamic entropy equation:

dS=\frac{dQ}{T}, assuming the collision occurs on a small scale and does not affect the average KE (i.e. temperature) of the ball particles.

So, after the collision, some ball particles gain velocity, which results in a gain in kinetic energy dQ. Because the gained velocity is random in direction, the more energy of this kind they gain, the more significant the magnitude of the random velocity becomes. In other words, dS is proportional to dQ, which matches the equation.

The size of this increase in significance depends on the initial velocity of the particles. If they already have a very high average velocity (hence a very high temperature), the introduction of the random velocity won't change the particle velocities much. In other words, dS is inversely proportional to T, which again matches the equation. The sketch below illustrates both behaviours.
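Here is a rough Monte Carlo sketch of both claims under my own toy assumptions: 1-D particles, a common drift speed standing in for the ball's fall, Gaussian thermal noise standing in for temperature, and the random share of kinetic energy standing in for entropy. It is an illustration of the picture above, not a real entropy calculation.

```python
import random

def random_share(drift, thermal_sigma, kick_sigma, n=200_000):
    """Random kinetic energy as a share of total kinetic energy (1-D toy)."""
    rand_ke = total_ke = 0.0
    for _ in range(n):
        noise = random.gauss(0.0, thermal_sigma) + random.gauss(0.0, kick_sigma)
        v = drift + noise
        rand_ke += noise ** 2
        total_ke += v ** 2
    return rand_ke / total_ke

def share_increase(drift, thermal_sigma, kick_sigma):
    """How much one round of random kicks raises the random-energy share."""
    return (random_share(drift, thermal_sigma, kick_sigma)
            - random_share(drift, thermal_sigma, 0.0))

# Bigger kicks (more transferred energy, playing the role of dQ) raise the
# random share more, echoing dS proportional to dQ
print(share_increase(drift=10, thermal_sigma=1, kick_sigma=1))
print(share_increase(drift=10, thermal_sigma=1, kick_sigma=2))

# A hotter baseline (larger thermal_sigma, standing in for T) means the same
# kick changes the random share less, echoing dS proportional to 1/T
print(share_increase(drift=10, thermal_sigma=1, kick_sigma=1))
print(share_increase(drift=10, thermal_sigma=4, kick_sigma=1))
```

The first pair of numbers grows with the kick size; the second pair shows the same kick mattering less when the baseline is hotter.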

(Note: temperature represents average kinetic energy, which by itself doesn't say whether the velocities are random in direction or how random they are. But particles confined in a fixed-size container are certain to collide with each other, so their velocities are statistically random. For such particles there is no way to increase the average KE without also increasing the random velocity.)

You could also read the equation the other way:

T\,dS=dQ

Suppose a body loses some entropy, which means a PORTION (not an amount) of random velocity is lost, so the particles of the body lose some KE due to the decrease in velocity. But entropy only tells you the portion; to get the actual amount of KE lost you must multiply it by how much KE the particles have (i.e. temperature). That is why dQ = T\,dS.

From the paragraphs above, thermodynamic entropy is the significance of the random part of the particle velocity vector compared to the whole velocity vector.
Now let me change "particle velocity vector" to "value" (or information, if you like), and the last sentence becomes:
Information entropy is the significance of the random part of a value compared to the whole value. Since I believe info entropy is easy to understand as long as you don't try to relate it to thermo-entropy (which is what I will do shortly), I skip the verification part and leave that to you; a minimal check is sketched below.
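For what it's worth, one way such a check might look, using Shannon's formula with base-2 logs and two made-up distributions for a "value": one nearly fixed (little randomness) and one completely uncertain.

```python
import math

def shannon_entropy(probs):
    """H = -sum p log2 p, in bits; zero-probability outcomes contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A value that is almost certain: low randomness, low entropy
nearly_fixed = [0.97, 0.01, 0.01, 0.01]
# A value that is anyone's guess: high randomness, maximal entropy
uniform = [0.25, 0.25, 0.25, 0.25]

print(shannon_entropy(nearly_fixed))  # about 0.24 bits
print(shannon_entropy(uniform))       # exactly 2.0 bits
```

The more random the value, the higher H, matching the "significance of the random part" reading above.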

The relationship between thermo-entropy and info-entropy is this: thermo-entropy measures the randomness of an actual physical phenomenon (particle velocity), while info-entropy measures the randomness of numbers. Info-entropy is the mathematical tool, and thermo-entropy is an application of that tool to a physical phenomenon. The two are both related and unrelated: related because thermo-entropy borrows the info-entropy concept, unrelated because one tells you how particles behave and the other tells you how numbers behave.

Entropy as a statistical and a macroscopic phenomenon
Doing work requires particles to hit a wall together (statistically speaking). With increasing randomness in their velocities, the chance of them hitting a wall together becomes increasingly low, so doing work becomes statistically impossible. On a macroscopic scale, this statistical impossibility manifests as "less work is done (compared to frictionless predictions)". Furthermore, as entropy increases everywhere in the universe, the actual work done falls further and further below the prediction until, finally, no work is done.
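A back-of-envelope sketch of this statistical impossibility, assuming each particle's push direction is an independent 50/50 coin flip (my simplification, not the post's):

```python
# Probability that all n particles happen to push the wall together,
# assuming each push direction is an independent 50/50 coin flip
for n in (10, 100, 1000):
    print(n, 0.5 ** n)
# 10 -> ~9.8e-04, 100 -> ~7.9e-31, 1000 -> ~9.3e-302
```

At anything like real particle counts, the probability of a fully coordinated push is indistinguishable from zero.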

(Note: the difficulty in learning the notion of entropy comes from:
1. Thermo-entropy is related to why less work is done during a heating process. So learners face a very strong temptation to think of entropy as "a measure of how much less work is done" or as some kind of energy-storing micro-mechanism. As demonstrated by Andrew Mason in post #23, there is no such proportionality.

2. Formalism. Learners usually start learning entropy from the definition, which is written in highly condensed language, with the whole development history from concepts to words to mathematical symbols left unmentioned. Without proper conceptual linking, an equation means no more than "a value equals something times something".)
 
  • #63
marmoset said:
Ed Jaynes wrote a lot about this; check out this paper:

http://bayes.wustl.edu/etj/articles/theory.1.pdf

Lots of his other papers are online at http://bayes.wustl.edu/etj/node1.html; interesting stuff.

marmoset, thanks for the links. Note #74 on thermal efficiency has, in my mind, validated my thoughts about multiple heat sinks in a system. I think it is now clear to me how to better describe my idea of a generator with two or more systems.

As for this thread, my mind may have jumped ahead of what I have read or understood, but it seems that Boltzmann's approach has generally been framed in terms of single-particle, single-impact events.

Is it possible that the connection between thermal and information entropy goes much further, given the speed and number of electron interactions within the time frame of a single impact of one particle?
Just as a transformer changes and transmits voltage and current, with some resulting heat involved, it seems to me that what happens at the container wall surface and in the particle's electron cloud involves orders of magnitude more than what is being calculated.

Forgive my comments if they anticipate what others have already said in writings I am about to study.

Ron
 
