On the relation between quantum and statistical mechanics

  • #1
Demystifier
Science Advisor
Insights Author
Gold Member
It is well known that quantum mechanics in the path-integral form is formally very similar to equilibrium statistical mechanics formulated in terms of a partition function. In a relatively recent, very readable and straightforward paper
http://lanl.arxiv.org/abs/1311.0813
John Baez (a well-known mathematical physicist) and Blake Pollard develop this formal analogy further by introducing a quantum analogue of entropy, which they call quantropy. I feel that this paper might be interesting and illuminating for many people on this forum.
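For readers who have not seen the analogy spelled out, here is a minimal sketch (in notation close to the paper's, with A(x) the action of a history x). In equilibrium statistical mechanics the partition function is a sum over states,
[tex]Z = \sum_x e^{-E(x)/kT},[/tex]
while in the path-integral picture the quantum "partition function" is a sum over histories,
[tex]Z = \sum_x e^{iA(x)/\hbar} = \sum_x e^{-A(x)/(i\hbar)},[/tex]
so the formal dictionary reads [itex]E \mapsto A[/itex] and [itex]kT \mapsto i\hbar[/itex]. Quantropy is then obtained by inserting the normalized amplitudes [itex]a(x) = e^{iA(x)/\hbar}/Z[/itex] into the entropy formula in place of the Gibbs probabilities.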

Another reason for posting it is to offer a veiled critique of the AdS/CFT correspondence. This example demonstrates that a mathematical correspondence between two theories does not by itself mean that the two theories really describe the same physics.
 

Answers and Replies

  • #2
This example demonstrates that a mathematical correspondence between two theories does not by itself mean that the two theories really describe the same physics.
Might similar considerations apply between information theory and statistical mechanics? The fact that the same mathematical expression occurs both in statistical mechanics (thermodynamic entropy) and in information theory (information-theoretic entropy) does not in itself establish any connection between these fields, does it?

Patrick
 
  • #3
Demystifier
Science Advisor
Insights Author
Gold Member
Might similar considerations apply between information theory and statistical mechanics? The fact that the same mathematical expression occurs both in statistical mechanics (thermodynamic entropy) and in information theory (information-theoretic entropy) does not in itself establish any connection between these fields, does it?
In this case the relation is much more than merely formal. Both entropies really are examples of entropy in a semantic sense, which cannot be said for quantropy.
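To be concrete, the two formulas differ only by Boltzmann's constant and the base of the logarithm,
[tex]S_{\rm Gibbs} = -k_B \sum_i p_i \ln p_i, \qquad H_{\rm Shannon} = -\sum_i p_i \log_2 p_i,[/tex]
and in both cases the [itex]p_i[/itex] are genuine probabilities quantifying uncertainty; that shared interpretation is what I mean by a semantic, and not merely formal, relation.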
 
  • #4
Both entropies really are examples of entropy in a semantic sense
E. T. Jaynes wrote in his book "Probability Theory: The Logic of Science":

E. T. Jaynes said:
In particular, the association of the word 'information' with entropy expressions seems in retrospect quite unfortunate, because it persists in carrying the wrong impression to so many people.

The function H is called the entropy, or better, the information entropy of the distribution. This is an unfortunate terminology, which now seems impossible to correct. We must warn at the outset that the major occupational disease of this field is the persistent failure to distinguish between the information entropy, which is a property of any probability distribution, and the experimental entropy of thermodynamics, which is instead a property of a thermodynamic state as defined, for instance, by such observed quantities as pressure, ...

Patrick
 
  • #5
Demystifier
Science Advisor
Insights Author
Gold Member
I think Jaynes wanted to say that within thermal physics there are two semantically different notions of entropy: one statistical and the other thermodynamic. If that is what he wanted to say, then I agree. But the statistical Gibbs entropy of thermal physics and the statistical Shannon entropy of information theory are semantically very much related.
 
  • #6
I think Jaynes wanted to say that within thermal physics there are two semantically different notions of entropy:
Maybe, yet his goal seems to have been to show that statistical mechanics, communication theory, and a mass of other applications are all instances of a single method of reasoning.

That is a viewpoint from which thermodynamic entropy and information-theory entropy appear as the same concept (http://bayes.wustl.edu/etj/articles/theory.1.pdf).

Isn't that the case for the concept of quantropy?

Patrick
 
  • #7
Demystifier
Science Advisor
Insights Author
Gold Member
Isn't that the case for the concept of quantropy?
No. Entropy is derived from probability, which is a positive quantity. Quantropy is derived from probability amplitude, which is not a positive quantity.
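To make the contrast explicit, here is a sketch using the definitions from the paper:
[tex]S = -\sum_x p(x) \ln p(x), \qquad Q = -\sum_x a(x) \ln a(x), \qquad a(x) = \frac{e^{iA(x)/\hbar}}{Z}.[/tex]
Since the amplitudes [itex]a(x)[/itex] are complex, [itex]\ln a(x)[/itex] requires a choice of branch, and [itex]Q[/itex] is in general complex, so it cannot be interpreted as a measure of uncertainty the way entropy can.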
 
  • #8
TeethWhitener
Science Advisor
Gold Member
Interesting idea. I haven't read the entire paper too closely, but I'm wondering about the correspondence [itex]T \mapsto i\hbar[/itex]. In stat mech, you have a lot of quantities that are derivatives with respect to [tex]\beta \propto \frac{1}{T}.[/tex]
For instance: [tex]\langle E \rangle = - \frac{d}{d\beta}\ln Z.[/tex]
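As a quick sanity check of that identity on a two-level system with energies [itex]\epsilon_1[/itex] and [itex]\epsilon_2[/itex]:
[tex]Z = e^{-\beta\epsilon_1} + e^{-\beta\epsilon_2}, \qquad -\frac{d}{d\beta}\ln Z = \frac{\epsilon_1 e^{-\beta\epsilon_1} + \epsilon_2 e^{-\beta\epsilon_2}}{Z} = \langle E \rangle.[/tex]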
How much sense does it make to have a derivative with respect to a constant (Planck's constant)?
 
  • #9
Demystifier
Science Advisor
Insights Author
Gold Member
How much sense does it make to have a derivative with respect to a constant (Planck's constant)?
Mathematically, there is no problem with it. Physically, you can replace the Planck constant with a true variable y, take the derivative with respect to y, and then evaluate all quantities at y = h. The result is the same.
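To sketch how that works here (this is my reading of the paper's setup, so take it as illustrative rather than definitive): writing [itex]\lambda = 1/(i\hbar)[/itex] as the analogue of [itex]\beta[/itex], the expected action is
[tex]\langle A \rangle = -\frac{d}{d\lambda}\ln Z,[/tex]
computed by treating [itex]\lambda[/itex] as a variable and only at the end evaluating it at its physical value, exactly as one evaluates thermal derivatives at the physical temperature.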
 
  • #10
TeethWhitener
Science Advisor
Gold Member
Mathematically, there is no problem with it. Physically, you can replace the Planck constant with a true variable y, take the derivative with respect to y, and then evaluate all quantities at y = h. The result is the same.
Right, but I'm not sure what insight it adds. Reading the paper, it seems the authors aren't really sure either. Although, glancing over it again, the partition function has a suggestive form: [tex]Z=\frac{2\pi \hbar i\,\Delta t}{m (\Delta x)^2},[/tex]
so that if we choose [tex]\frac{\Delta t}{(\Delta x)^2}=\frac{1}{c\,\Delta \tilde{x}},[/tex]
we get [tex]Z=\frac{ih}{mc\,\Delta \tilde{x}},[/tex]
so that the length parameter [itex]\Delta \tilde{x}[/itex] and the partition function [itex]Z[/itex] together define the Compton wavelength of the particle.
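(Checking the algebra: with [itex]h = 2\pi\hbar[/itex] and the Compton wavelength [itex]\lambda_C = h/mc[/itex], [tex]Z = \frac{2\pi\hbar i\,\Delta t}{m(\Delta x)^2} = \frac{ih}{mc\,\Delta\tilde{x}} = \frac{i\lambda_C}{\Delta\tilde{x}}.[/tex])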
 
