On the relation between quantum and statistical mechanics

In summary, the paper by John Baez and Blake Pollard introduces a quantum analogue of entropy called quantropy. The analogy between entropy and quantropy is formal: the two quantities have different semantic meanings. The thread also offers a concealed critique of the AdS/CFT correspondence, as an example showing that a mathematical correspondence between two theories need not mean they describe the same physics.
  • #1

Demystifier

It is well known that quantum mechanics in the path-integral form is formally very similar to equilibrium statistical mechanics formulated in terms of a partition function. In a relatively recent, very readable and straightforward paper
http://lanl.arxiv.org/abs/1311.0813
John Baez (a well-known mathematical physicist) and Blake Pollard develop this formal analogy further by introducing a quantum analogue of entropy, which they call quantropy. I feel that this paper might be interesting and illuminating for many people on this forum.

Another reason for posting it is to make a concealed critique of the AdS/CFT correspondence. This example demonstrates that, just because there is a mathematical correspondence between two theories, it doesn't mean that the two theories really describe the same physics.
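Schematically, the analogy developed in the paper looks like this (my own shorthand, so signs and conventions may differ slightly from theirs). In statistical mechanics one has Boltzmann probabilities and the Gibbs entropy,
[tex]p(x)=\frac{e^{-E(x)/kT}}{Z},\qquad Z=\sum_x e^{-E(x)/kT},\qquad S=-\sum_x p(x)\ln p(x),[/tex]
while in the path-integral picture one has normalized amplitudes and the quantropy,
[tex]a(x)=\frac{e^{iA(x)/\hbar}}{Z},\qquad Z=\sum_x e^{iA(x)/\hbar},\qquad Q=-\sum_x a(x)\ln a(x),[/tex]
where A(x) is the action of the history x and a branch of the logarithm has to be chosen, since the a(x) are complex. The rough dictionary is [itex]E \leftrightarrow A[/itex] and [itex]kT \leftrightarrow i\hbar[/itex].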
 
  • #2
Demystifier said:
This example demonstrates that, just because there is a mathematical correspondence between two theories, it doesn't mean that the two theories really describe the same physics.
Might similar considerations apply between information theory and statistical mechanics? That the same mathematical expression occurs both in statistical mechanics (thermodynamic entropy) and in information theory (information-theoretic entropy) does not in itself establish any connection between these fields, does it?

Patrick
 
  • #3
microsansfil said:
Might similar considerations apply between information theory and statistical mechanics? That the same mathematical expression occurs both in statistical mechanics (thermodynamic entropy) and in information theory (information-theoretic entropy) does not in itself establish any connection between these fields, does it?
In this case the relation is much more than merely formal. Both entropies are really examples of entropy in a semantic sense, which cannot be said for quantropy.
 
  • #4
Demystifier said:
Both entropies are really examples of entropy in a semantic sense
E. T. Jaynes wrote in his book "Probability Theory: The Logic of Science":

E. T. Jaynes said:
In particular, the association of the word 'information' with entropy expressions seems in retrospect quite unfortunate, because it persists in carrying the wrong impression to so many people.

The function H is called the entropy, or better, the information entropy of the distribution. This is an unfortunate terminology, which now seems impossible to correct. We must warn at the outset that the major occupational disease of this field is the persistent failure to distinguish between the information entropy, which is a property of any probability distribution, and the experimental entropy of thermodynamics, which is instead a property of a thermodynamic state as defined, for instance, by such observed quantities as pressure, ...

Patrick
 
  • #5
I think Jaynes wanted to say that within thermal physics there are two semantically different notions of entropy: one statistical and the other thermodynamical. If this is what he wanted to say, then I agree. But statistical Gibbs entropy in thermal physics and statistical Shannon entropy are semantically very much related.
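Concretely: the statistical (Gibbs) entropy of a distribution [itex]p_i[/itex] is
[tex]S=-k_B\sum_i p_i\ln p_i,[/tex]
which is just the Shannon entropy [itex]H=-\sum_i p_i\ln p_i[/itex] multiplied by the Boltzmann constant (information theorists usually also switch to log base 2, which only changes the units). So at the statistical level the two notions coincide up to a constant factor.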
 
  • #6
Demystifier said:
I think Jaynes wanted to say that within thermal physics there are two semantically different notions of entropy:
Maybe, yet his goal seems to be to show that statistical mechanics, communication theory, and a mass of other applications are all instances of a single method of reasoning.

A viewpoint from which thermodynamic entropy and information-theory entropy appear as the same concept ( http://bayes.wustl.edu/etj/articles/theory.1.pdf ).

Isn't that the case for the concept of quantropy?

Patrick
 
  • #7
microsansfil said:
Isn't that the case for the concept of quantropy?
No. Entropy is derived from probability, which is a positive quantity. Quantropy is derived from probability amplitude, which is not a positive quantity.
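To make this explicit: the normalized amplitudes [itex]a(x)=e^{iA(x)/\hbar}/Z[/itex] are complex, so
[tex]Q=-\sum_x a(x)\ln a(x)[/tex]
is in general a complex number (and one even needs to pick a branch of the logarithm to define it). A complex quantity cannot be read as a measure of uncertainty or missing information, which is exactly what [itex]-\sum_x p(x)\ln p(x)[/itex] is for genuine probabilities [itex]p(x)\ge 0[/itex] summing to 1.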
 
  • #8
Interesting idea. I haven't read the entire paper too closely, but I'm wondering about the correspondence [itex]T \mapsto i\hbar[/itex]. In stat mech, you have a lot of quantities that are derivatives with respect to [tex]\beta \propto \frac{1}{T}[/tex]
For instance: [tex]\langle E \rangle = - \frac{d}{d\beta}\ln Z[/tex]
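In the quantropy setting the analogue would presumably be an expected action, obtained by differentiating with respect to [itex]\lambda = 1/(i\hbar)[/itex] rather than [itex]\beta[/itex] (this is my guess at the dictionary, so a sign or factor may be off):
[tex]\langle A \rangle = -\frac{d}{d\lambda}\ln Z,\qquad \lambda \equiv \frac{1}{i\hbar}.[/tex]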
How much sense does it make to have a derivative with respect to a constant (Planck's constant)?
 
  • #9
TeethWhitener said:
How much sense does it make to have a derivative with respect to a constant (Planck's constant)?
Mathematically, there is no problem with it. Physically, you can replace the Planck constant with a true variable y, take the derivative with respect to y, and then evaluate all quantities at [itex]y=\hbar[/itex]. The result is the same.
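For example (a toy illustration of what I mean): treat [itex]Z(y)=\sum_x e^{iA(x)/y}[/itex] as an ordinary function of the variable y, compute
[tex]\frac{d}{dy}\ln Z(y)=\frac{\sum_x \left(-\frac{iA(x)}{y^2}\right)e^{iA(x)/y}}{\sum_x e^{iA(x)/y}},[/tex]
and only at the end set [itex]y=\hbar[/itex].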
 
  • #10
Demystifier said:
Mathematically, there is no problem with it. Physically, you can replace the Planck constant with a true variable y, take the derivative with respect to y, and then evaluate all quantities at [itex]y=\hbar[/itex]. The result is the same.
Right, but I'm not sure what insight it adds. Reading the paper, it seems like the authors aren't really sure either. Although, glancing over it again, the partition function has a suggestive form [tex]Z=\frac{2\pi \hbar i \Delta t}{m (\Delta x)^2}[/tex]
so that if we choose [tex]\frac{\Delta t}{(\Delta x)^2}=\frac{1}{c} \frac{1}{ \Delta \tilde{x}}[/tex]
we get [tex]Z=\frac{ih}{mc\Delta \tilde{x}}[/tex]
so that the length parameter [itex]\Delta \tilde{x}[/itex] and the partition function [itex]Z[/itex] define the Compton wavelength of the particle.
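In other words, writing [itex]\lambda_C = h/(mc)[/itex] for the Compton wavelength, this is just
[tex]Z = i\,\frac{\lambda_C}{\Delta\tilde{x}},[/tex]
so (unless I've garbled a constant) the magnitude of the partition function is the Compton wavelength measured in units of [itex]\Delta\tilde{x}[/itex].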
 
