On the relation between quantum and statistical mechanics

Summary:
Quantum mechanics in its path-integral form shares a formal similarity with equilibrium statistical mechanics, particularly through the concept of a partition function. John Baez and Blake Pollard's paper introduces "quantropy," a quantum analogue of entropy, which highlights the nuanced differences between various entropy concepts. The discussion critiques the AdS/CFT correspondence, arguing that mathematical similarities between theories do not guarantee they describe the same physical phenomena. It also explores the relationship between information theory and statistical mechanics, emphasizing that shared mathematical expressions do not imply a direct connection. Overall, the conversation underscores the complexity of entropy across different fields and the need for careful distinctions in terminology.
Demystifier
It is well known that quantum mechanics in its path-integral form is formally very similar to equilibrium statistical mechanics formulated in terms of a partition function. In a relatively recent, very readable and straightforward paper,
http://lanl.arxiv.org/abs/1311.0813
John Baez (a well-known mathematical physicist) and Blake Pollard develop this formal analogy further by introducing a quantum analogue of entropy, which they call quantropy. I feel that this paper might be interesting and illuminating for many people on this forum.
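
Schematically, the analogy runs as follows (a minimal sketch in the paper's discretized setting, writing the action as $A(x)$ to avoid a clash with the usual symbol $S$ for entropy):
$$
p_i=\frac{e^{-\beta E_i}}{Z},\quad Z=\sum_i e^{-\beta E_i}
\qquad\longleftrightarrow\qquad
a(x)=\frac{e^{iA(x)/\hbar}}{Z},\quad Z=\sum_x e^{iA(x)/\hbar},
$$
so the formal substitution $\beta \to 1/(i\hbar)$, $E_i \to A(x)$ turns the Boltzmann distribution over microstates into the Feynman sum over histories.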

Another reason for posting it is to offer a veiled critique of the AdS/CFT correspondence. This example demonstrates that a mathematical correspondence between two theories does not, by itself, mean that the two theories describe the same physics.
 
Demystifier said:
This example demonstrates that a mathematical correspondence between two theories does not, by itself, mean that the two theories describe the same physics.
Might similar considerations apply between information theory and statistical mechanics? The fact that the same mathematical expression occurs both in statistical mechanics (thermodynamic entropy) and in information theory (information-theoretic entropy) does not in itself establish any connection between these fields, does it?

Patrick
 
microsansfil said:
Might similar considerations apply between information theory and statistical mechanics? The fact that the same mathematical expression occurs both in statistical mechanics (thermodynamic entropy) and in information theory (information-theoretic entropy) does not in itself establish any connection between these fields, does it?
In this case the relation is much more than merely formal. Both entropies are really examples of entropy in a semantic sense, which cannot be said for quantropy.
 
Demystifier said:
Both entropies are really examples of entropy in a semantic sense
E. T. Jaynes wrote in his book "Probability Theory: The Logic of Science":

E. T. Jaynes said:
In particular, the association of the word 'information' with entropy expressions seems in retrospect quite unfortunate, because it persists in carrying the wrong impression to so many people.

The function H is called the entropy, or better, the information entropy of the distribution. This is an unfortunate terminology, which now seems impossible to correct. We must warn at the outset that the major occupational disease of this field is the persistent failure to distinguish between the information entropy, which is a property of any probability distribution, and the experimental entropy of thermodynamics, which is instead a property of a thermodynamic state as defined, for instance, by such observed quantities as pressure, ...

Patrick
 
I think Jaynes wanted to say that within thermal physics there are two semantically different notions of entropy: one statistical and the other thermodynamic. If this is what he wanted to say, then I agree. But the statistical Gibbs entropy of thermal physics and the statistical Shannon entropy of information theory are semantically very closely related.
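Concretely, the two statistical entropies differ only by a constant factor fixed by the choice of units and logarithm base:
$$
S_{\mathrm{Gibbs}}=-k_B\sum_i p_i\ln p_i=(k_B\ln 2)\,H,\qquad H=-\sum_i p_i\log_2 p_i,
$$
so any statement about one translates directly into a statement about the other.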
 
Demystifier said:
I think Jaynes wanted to say that within thermal physics there are two semantically different notions of entropy:
Maybe, yet his goal seems to be to show that statistical mechanics, communication theory, and a mass of other applications are all instances of a single method of reasoning.

A viewpoint from which thermodynamic entropy and information-theoretic entropy appear as the same concept (http://bayes.wustl.edu/etj/articles/theory.1.pdf).

Isn't that also the case for the concept of quantropy?

Patrick
 
microsansfil said:
Isn't that also the case for the concept of quantropy?
No. Entropy is derived from probability, which is a positive quantity. Quantropy is derived from the probability amplitude, which is a complex, not a positive, quantity.
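To make the contrast explicit (schematically, in the paper's discrete setting):
$$
S=-\sum_i p_i\ln p_i,\quad p_i\ge 0,
\qquad\text{vs.}\qquad
Q=-\sum_x a(x)\ln a(x),\quad a(x)\in\mathbb{C}.
$$
Since the amplitudes are complex, $Q$ is in general complex, and $\ln a(x)$ even requires a choice of branch, so quantropy cannot carry the probabilistic meaning that entropy does.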
 
Interesting idea. I haven't read the entire paper too closely, but I'm wondering about the correspondence $T \mapsto i\hbar$. In stat mech, you have a lot of quantities that are derivatives with respect to $\beta \propto \frac{1}{T}$.
For instance: $\langle E \rangle = -\frac{d}{d\beta}\ln Z$.
How much sense does it make to have a derivative with respect to a constant (Planck's constant)?
 
TeethWhitener said:
How much sense does it make to have a derivative with respect to a constant (Planck's constant)?
Mathematically, there is no problem with it. Physically, you can replace the Planck constant with a true variable $y$, take derivatives with respect to $y$, and then evaluate all quantities at $y=\hbar$. The result is the same.
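In the spirit of the paper, one can make this concrete by introducing $\alpha \equiv 1/(i\hbar)$ as the analogue of $\beta$ and treating it as a variable (a sketch, up to conventions):
$$
Z(\alpha)=\sum_x e^{-\alpha A(x)},\qquad
\langle A\rangle=-\frac{\partial}{\partial\alpha}\ln Z(\alpha)\bigg|_{\alpha=1/(i\hbar)},
$$
in exact parallel to $\langle E\rangle=-\frac{d}{d\beta}\ln Z$.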
 
Demystifier said:
Mathematically, there is no problem with it. Physically, you can replace the Planck constant with a true variable $y$, take derivatives with respect to $y$, and then evaluate all quantities at $y=\hbar$. The result is the same.
Right, but I'm not sure what insight it adds. Reading the paper, it seems like the authors aren't really sure either. Although, glancing over it again, the partition function has a suggestive form
$$Z=\frac{2\pi \hbar i\,\Delta t}{m (\Delta x)^2},$$
so that if we choose
$$\frac{\Delta t}{(\Delta x)^2}=\frac{1}{c\,\Delta \tilde{x}},$$
we get
$$Z=\frac{ih}{mc\,\Delta \tilde{x}},$$
so that the length parameter $\Delta \tilde{x}$ and the partition function $Z$ define the Compton wavelength of the particle.
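Indeed, since $h=2\pi\hbar$, this is a one-line check:
$$
Z=\frac{2\pi\hbar\,i}{mc\,\Delta\tilde{x}}=\frac{ih}{mc\,\Delta\tilde{x}}=i\,\frac{\lambda_C}{\Delta\tilde{x}},\qquad \lambda_C=\frac{h}{mc},
$$
so $|Z|$ is just the Compton wavelength measured in units of $\Delta\tilde{x}$.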
 
