Entropy -- What exactly is it?

  • Context: High School
  • Thread starter: Levi Woods
  • Tags: Entropy

Discussion Overview

The discussion centers around the concept of entropy, with participants expressing confusion about its definition and implications. The scope includes theoretical understanding and interpretations from statistical mechanics and quantum mechanics.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested

Main Points Raised

  • One participant expresses confusion about entropy and seeks clarification.
  • Another participant suggests that the original question is too broad and encourages focusing on specific areas of difficulty.
  • A participant mentions that their understanding of entropy relates to the probability of an atom being in a certain place, but finds this explanation insufficient.
  • A later reply provides a technical explanation of entropy as a measure of missing information based on probability distributions, referencing formulas from statistical mechanics and quantum mechanics.
  • The technical explanation includes specific mathematical expressions for entropy in different contexts, such as discrete outcomes and quantum states.
  • Further resources are suggested for a deeper understanding of the information-theoretical approach to statistical physics.

Areas of Agreement / Disagreement

Participants do not reach a consensus on the definition of entropy, with multiple interpretations and levels of understanding expressed. The discussion remains unresolved regarding a clear, universally accepted explanation of entropy.

Contextual Notes

Some participants indicate that existing sources, such as Wikipedia, do not provide adequate clarity, highlighting the need for more specific or detailed explanations. The discussion reflects varying levels of familiarity with the mathematical and conceptual aspects of entropy.

Levi Woods
I am still confused about what exactly entropy is. Any help?
 
Moderator's note: Thread level changed to "B".
 
@Levi Woods what sources have you tried to consult about entropy? What have you found difficult to understand about what they say? Your question is much too broad as it stands; focusing in on particular areas of difficulty will make it much more likely that you'll get useful responses.
 
PeterDonis said:
@Levi Woods what sources have you tried to consult about entropy? What have you found difficult to understand about what they say? Your question is much too broad as it stands; focusing in on particular areas of difficulty will make it much more likely that you'll get useful responses.

I've just read Wikipedia articles, which aren't very helpful. My understanding is that it's the chance that an atom will be in a certain place.
 
Levi Woods said:
I've just read Wikipedia articles, which aren't very helpful. My understanding is that it's the chance that an atom will be in a certain place.

I'm afraid this is still too broad. Please pick out a specific source and explain why you didn't find it helpful.
 
Levi Woods said:
I am still confused about what exactly entropy is. Any help?
Look at this:

[Attachment: IMG_20190103_091522.jpeg]
Hm, it's a bit unfortunate that this thread was set to B-level; it's nearly impossible to answer it at B-level. So let me give an answer at I-level.

Within a modern approach, entropy is understood as a measure of the missing information, given a probability distribution.

It turns out that for a random experiment with a discrete set of ##N## possible outcomes, each with equal probability ##P_i=1/N## (like throwing a fair die, where the probability for each outcome is 1/6), the measure is
$$S=k_{\text{B}} \ln N.$$
Starting from this result, you can derive the more general result (due to Shannon) that for given probabilities ##P_i##
$$S=-k_{\text{B}} \sum_i P_i \ln(P_i).$$
It is understood here that for ##P_i=0## one puts ##P_i \ln(P_i)=0##.
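The two formulas above are easy to check numerically. A minimal sketch (with ##k_{\text{B}}## set to 1 for convenience; the function name `entropy` is just an illustrative choice):

```python
import math

def entropy(probs, k_B=1.0):
    """S = -k_B * sum_i P_i ln(P_i), with 0 ln 0 taken as 0."""
    return -k_B * sum(p * math.log(p) for p in probs if p > 0)

# A fair die: six equally likely outcomes, P_i = 1/6.
fair = [1/6] * 6
# For equal probabilities the general formula reduces to k_B ln N.
assert abs(entropy(fair) - math.log(6)) < 1e-12

# A loaded die carries less missing information, so its entropy is lower.
loaded = [0.5, 0.1, 0.1, 0.1, 0.1, 0.1]
assert entropy(loaded) < entropy(fair)

# A certain outcome (P_1 = 1) means no missing information at all: S = 0.
assert entropy([1.0, 0, 0, 0, 0, 0]) == 0.0
```

The three assertions mirror the three statements in the text: the equal-probability case reduces to ##k_{\text{B}} \ln N##, biasing the distribution lowers the entropy, and a sharp distribution has zero entropy.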

In quantum mechanics the probability distributions are given by the statistical operator ##\hat{\rho}##, which represents the state, and the formula becomes (first established by von Neumann)
$$S=-k_{\text{B}} \mathrm{Tr}(\hat{\rho} \ln \hat{\rho}).$$
For the special case that the system is prepared in a pure state one has ##\hat{\rho}=|\Psi \rangle \langle \Psi|##. Evaluating the trace in a complete orthonormal set ##|u_i \rangle## containing ##|u_1 \rangle=|\Psi \rangle## as a member gives
$$S=-k_{\text{B}} \cdot 1 \ln 1=0,$$
which tells you that knowing the system is prepared in a pure state is complete knowledge, as it should be in QT.
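The von Neumann formula can also be checked numerically: since ##S=-k_{\text{B}}\sum_i \lambda_i \ln \lambda_i## in the eigenbasis of ##\hat{\rho}##, it suffices to diagonalize the density matrix. A sketch using NumPy, again with ##k_{\text{B}}=1## (the function name `von_neumann_entropy` is an illustrative choice):

```python
import numpy as np

def von_neumann_entropy(rho, k_B=1.0):
    """S = -k_B Tr(rho ln rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]  # drop zero eigenvalues (0 ln 0 = 0)
    return -k_B * np.sum(evals * np.log(evals))

# Pure qubit state |Psi> = |0>: rho = |Psi><Psi| has eigenvalues (1, 0),
# so S = 0, i.e. complete knowledge, as argued above.
psi = np.array([1.0, 0.0])
rho_pure = np.outer(psi, psi.conj())
assert abs(von_neumann_entropy(rho_pure)) < 1e-12

# Maximally mixed qubit, rho = I/2: eigenvalues (1/2, 1/2), so S = ln 2,
# the maximum possible missing information for two outcomes.
rho_mixed = np.eye(2) / 2
assert abs(von_neumann_entropy(rho_mixed) - np.log(2)) < 1e-12
```

The pure-state case reproduces ##S=-k_{\text{B}} \cdot 1\ln 1=0## from the text, while the maximally mixed state recovers the equal-probability result ##k_{\text{B}}\ln N## with ##N=2##.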

For more details about the information-theoretical approach to statistical physics, see the book by Jochen Rau that I've recommended before. I also have a short manuscript using this concept:

https://th.physik.uni-frankfurt.de/~hees/publ/stat.pdf
 
