Entropy -- What exactly is it? (thread level: B)

I am still confused about what exactly entropy is; any help?
 
Moderator's note: Thread level changed to "B".
 
@Levi Woods what sources have you tried to consult about entropy? What have you found difficult to understand about what they say? Your question is much too broad as it stands; focusing in on particular areas of difficulty will make it much more likely that you'll get useful responses.
 
PeterDonis said:
@Levi Woods what sources have you tried to consult about entropy? What have you found difficult to understand about what they say? Your question is much too broad as it stands; focusing in on particular areas of difficulty will make it much more likely that you'll get useful responses.

I've just read Wikipedia articles, which aren't very helpful. My understanding is that it's the chance that an atom will be in a certain place.
 
Levi Woods said:
I've just read Wikipedia articles, which aren't very helpful. My understanding is that it's the chance that an atom will be in a certain place.

I'm afraid this is still too broad. Please pick out a specific source and explain why you didn't find it helpful.
 
Levi Woods said:
I am still confused about what exactly entropy is; any help?
Look at this
[Attachment: IMG_20190103_091522.jpeg, 52.1 KB]
 
Hm, it's a bit unfortunate that this thread was set to B-level; it's nearly impossible to answer at that level. So let me give an answer at I-level instead.

Within a modern approach, entropy is understood as a measure of the missing information, given a probability distribution.

It turns out that for a random experiment with a discrete set of ##N## possible outcomes, each occurring with equal probability (like throwing a fair die, where the probability of each outcome is 1/6), i.e., ##P_i=1/N##, this measure is
$$S=k_{\text{B}} \ln N.$$
Starting from this result, due to Shannon, you can derive the more general result that for given probabilities ##P_i##
$$S=-k_{\text{B}} \sum_i P_i \ln(P_i).$$
It is understood here that for ##P_i=0## one has to put ##P_i \ln(P_i)=0##.
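
As a quick numerical illustration of these two formulas, here is a minimal sketch in Python with NumPy, working in units where ##k_{\text{B}}=1## (the function name `shannon_entropy` is mine, just for illustration, not something from the post):

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy -sum_i p_i ln(p_i), with the convention 0*ln(0) = 0.

    Units: k_B = 1, natural logarithm.
    """
    p = np.asarray(p, dtype=float)
    nonzero = p > 0          # skip zero entries: p*ln(p) -> 0 as p -> 0
    return -np.sum(p[nonzero] * np.log(p[nonzero]))

# Fair die: six equally likely outcomes, P_i = 1/6.
p_die = np.full(6, 1.0 / 6.0)
print(shannon_entropy(p_die))   # ~1.7918
print(np.log(6))                # ln(N) for N = 6, as expected

# A biased die has lower entropy: less missing information.
p_biased = np.array([0.9, 0.02, 0.02, 0.02, 0.02, 0.02])
print(shannon_entropy(p_biased))  # ~0.49 < ln(6)
```

For the uniform distribution the general formula reduces to ##\ln N##, and any deviation from uniformity lowers the entropy, i.e., less information is missing.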

In quantum mechanics the probability distributions are encoded in the statistical operator ##\hat{\rho}##, which represents the state, and the formula becomes (first established by von Neumann)
$$S=-k_{\text{B}} \mathrm{Tr}(\hat{\rho} \ln \hat{\rho}).$$
For the special case that the system is prepared in a pure state, one has ##\hat{\rho}=|\Psi \rangle \langle \Psi|##. Evaluating the trace in a complete orthonormal set ##|u_i \rangle## that contains ##|u_1 \rangle=|\Psi \rangle## as a member, only the ##|u_1 \rangle## term contributes, and the trace becomes
$$S=-k_{\text{B}} \, 1 \ln 1=0,$$
which tells you that the knowledge that the system is prepared in a pure state is complete knowledge, as it should be in quantum theory.
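
Numerically, the von Neumann formula is easiest to evaluate by diagonalizing ##\hat{\rho}##: since the trace is basis independent, ##\mathrm{Tr}(\hat{\rho} \ln \hat{\rho})=\sum_i \lambda_i \ln \lambda_i## with ##\lambda_i## the eigenvalues of ##\hat{\rho}##. A minimal sketch (again Python with NumPy, ##k_{\text{B}}=1##; the function name is illustrative):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S = -Tr(rho ln rho), computed from the eigenvalues of rho (k_B = 1).

    rho must be a density matrix: Hermitian, positive semidefinite, trace 1.
    """
    evals = np.linalg.eigvalsh(rho)      # real eigenvalues of a Hermitian matrix
    evals = evals[evals > 1e-12]         # convention: lambda*ln(lambda) -> 0
    return -np.sum(evals * np.log(evals))

# Pure state: rho = |Psi><Psi| has eigenvalues (1, 0), so S = 0.
psi = np.array([1.0, 0.0])
rho_pure = np.outer(psi, psi.conj())
print(von_neumann_entropy(rho_pure))     # 0: complete knowledge

# Maximally mixed qubit: rho = I/2 has eigenvalues (1/2, 1/2), so S = ln(2).
rho_mixed = np.eye(2) / 2
print(von_neumann_entropy(rho_mixed))    # ~0.6931 = ln(2): maximal missing information
```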

For more details about the information-theoretical approach to statistical physics, see the book by Jochen Rau that I've recommended before. I also have a short manuscript using this concept:

https://th.physik.uni-frankfurt.de/~hees/publ/stat.pdf
 
