On the relation between quantum and statistical mechanics

Discussion Overview

The discussion centers on the relationship between quantum mechanics and statistical mechanics, particularly exploring the formal analogies and conceptual distinctions between them. Participants examine the implications of these connections, including the concept of quantropy and its relation to entropy in statistical mechanics and information theory.

Discussion Character

  • Exploratory
  • Debate/contested
  • Technical explanation

Main Points Raised

  • Some participants note the formal similarities between quantum mechanics in path-integral form and equilibrium statistical mechanics, particularly through the lens of a paper by Baez and Pollard introducing the concept of quantropy.
  • Others argue that mathematical correspondence does not imply that two theories describe the same physics, citing examples from information theory and statistical mechanics.
  • One participant emphasizes that thermodynamic entropy and information-theory entropy are semantically related concepts, while questioning whether quantropy is a valid analogue.
  • There is a discussion about E.T. Jaynes' perspective on the semantic differences between statistical and thermodynamic entropy, with some agreeing that they are distinct yet related concepts.
  • Participants explore the implications of using Planck's constant in derivatives, with one suggesting that mathematically it poses no issues, while another questions the physical insights gained from such manipulations.
  • One participant highlights a specific form of the partition function and its relation to the Compton wavelength, suggesting a deeper connection between quantum mechanics and statistical mechanics.

Areas of Agreement / Disagreement

Participants express differing views on the relationship between quantropy and traditional entropy concepts, with no consensus reached on the validity or implications of quantropy. The discussion also reflects uncertainty regarding the physical interpretations of mathematical manipulations involving Planck's constant.

Contextual Notes

Participants acknowledge the complexity of the relationship between different types of entropy and the potential for misinterpretation in terminology. The discussion also highlights the need for careful consideration of definitions and contexts when comparing concepts across fields.

Demystifier
It is well known that quantum mechanics in the path-integral form is formally very similar to equilibrium statistical mechanics formulated in terms of a partition function. In a relatively recent, very readable and straightforward paper
http://lanl.arxiv.org/abs/1311.0813
John Baez (a well-known mathematical physicist) and Blake Pollard develop this formal analogy further by introducing a quantum analogue of entropy, which they call quantropy. I feel that this paper might be interesting and illuminating for many people on this forum.
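For readers who have not yet opened the paper, the formal dictionary behind quantropy can be sketched roughly as follows (this is my sketch of the analogy, not the paper's exact notation; note the unfortunate clash between S for entropy and S[q] for the action):

```latex
% Equilibrium statistical mechanics: sum over states x with energies E_x
Z_{\mathrm{stat}} = \sum_x e^{-\beta E_x}, \qquad
p_x = \frac{e^{-\beta E_x}}{Z_{\mathrm{stat}}}, \qquad
S = -\sum_x p_x \ln p_x .

% Path-integral quantum mechanics: a "sum" over histories q with action S[q]
Z_{\mathrm{qm}} = \int \mathcal{D}q \, e^{iS[q]/\hbar}, \qquad
a[q] = \frac{e^{iS[q]/\hbar}}{Z_{\mathrm{qm}}} .

% The formal substitution \beta \leftrightarrow 1/(i\hbar),
% E_x \leftrightarrow S[q] then suggests defining, in analogy
% with the Gibbs entropy, the "quantropy"
Q = -\int \mathcal{D}q \, a[q] \ln a[q] ,
% where the "weights" a[q] are complex amplitudes, not probabilities.
```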

Another reason for posting it is to offer a concealed critique of the AdS/CFT correspondence. This example demonstrates that a mathematical correspondence between two theories does not by itself mean that the two theories describe the same physics.
 
Demystifier said:
This example demonstrates that a mathematical correspondence between two theories does not by itself mean that the two theories describe the same physics.
Might similar considerations apply between information theory and statistical mechanics? That the same mathematical expression occurs both in statistical mechanics (thermodynamic entropy) and in information theory (information-theory entropy) does not in itself establish any connection between these fields, does it?

Patrick
 
microsansfil said:
Might similar considerations apply between information theory and statistical mechanics? That the same mathematical expression occurs both in statistical mechanics (thermodynamic entropy) and in information theory (information-theory entropy) does not in itself establish any connection between these fields, does it?
In this case the relation is much more than merely formal. Both entropies are really examples of entropy in a semantic sense, which cannot be said for quantropy.
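The closeness of that relation can be checked directly: for a Boltzmann distribution, the Shannon formula ##-\sum_x p_x \ln p_x## reproduces the Gibbs entropy ##\ln Z + \beta \langle E \rangle## (in units of ##k_B##). A minimal sketch, with a made-up three-level toy spectrum (the energies and ##\beta## are arbitrary illustrative values):

```python
import math

# Hypothetical three-level system; energies and beta are made up for illustration.
energies = [0.0, 1.0, 2.5]
beta = 0.7

Z = sum(math.exp(-beta * E) for E in energies)       # partition function
p = [math.exp(-beta * E) / Z for E in energies]      # Boltzmann probabilities

shannon = -sum(pi * math.log(pi) for pi in p)        # information-theory entropy
mean_E = sum(pi * E for pi, E in zip(p, energies))
gibbs = math.log(Z) + beta * mean_E                  # thermodynamic S / k_B

print(abs(shannon - gibbs) < 1e-12)                  # prints True
```

The agreement is exact, not numerical coincidence: substituting ##\ln p_x = -\beta E_x - \ln Z## into the Shannon sum gives ##\beta \langle E \rangle + \ln Z## identically.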
 
Demystifier said:
Both entropies are really examples of entropy in a semantic sense
E. T. Jaynes wrote in his book "Probability Theory: The Logic of Science":

E.T. Jaynes said:
In particular, the association of the word 'information' with entropy expressions seems in retrospect quite unfortunate, because it persists in carrying the wrong impression to so many people.

The function H is called the entropy, or better, the information entropy of the distribution. This is an unfortunate terminology, which now seems impossible to correct. We must warn at the outset that the major occupational disease of this field is the persistent failure to distinguish between the information entropy, which is a property of any probability distribution, and the experimental entropy of thermodynamics, which is instead a property of a thermodynamic state as defined, for instance, by such observed quantities as pressure, ...

Patrick
 
I think Jaynes wanted to say that within thermal physics there are two semantically different notions of entropy: one statistical and the other thermodynamical. If this is what he wanted to say, then I agree. But statistical Gibbs entropy in thermal physics and statistical Shannon entropy are semantically very much related.
 
Demystifier said:
I think Jaynes wanted to say that within thermal physics there are two semantically different notions of entropy:
Maybe, yet his goal seems to be to show that statistical mechanics, communication theory, and a mass of other applications are all instances of a single method of reasoning.

A viewpoint from which thermodynamic entropy and information-theory entropy appear as the same concept ( http://bayes.wustl.edu/etj/articles/theory.1.pdf ).

Isn't that also the case for the concept of quantropy?

Patrick
 
microsansfil said:
Isn't that also the case for the concept of quantropy?
No. Entropy is derived from probability, which is a positive quantity. Quantropy is derived from probability amplitude, which is not a positive quantity.
 
Interesting idea. I haven't read the entire paper too closely, but I'm wondering about the correspondence ##T \mapsto i\hbar##. In stat mech, you have a lot of quantities that are derivatives with respect to ##\beta \propto \frac{1}{T}##, for instance ##\langle E \rangle = -\frac{d}{d\beta}\ln Z##.
How much sense does it make to have a derivative with respect to a constant (Planck's constant)?
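The stat-mech identity ##\langle E \rangle = -\frac{d}{d\beta}\ln Z## itself is easy to verify numerically; the spectrum below is a made-up toy example, not from the paper:

```python
import math

# Toy spectrum; energies and beta are arbitrary illustrative values.
energies = [0.0, 1.0, 2.5]
beta = 0.7
eps = 1e-6

def lnZ(b):
    # log of the partition function Z(b) = sum_x exp(-b * E_x)
    return math.log(sum(math.exp(-b * E) for E in energies))

# <E> computed directly from the Boltzmann weights
Z = sum(math.exp(-beta * E) for E in energies)
mean_E = sum(E * math.exp(-beta * E) / Z for E in energies)

# <E> computed as -d(ln Z)/d(beta), via a central finite difference
numeric = -(lnZ(beta + eps) - lnZ(beta - eps)) / (2 * eps)

print(abs(numeric - mean_E) < 1e-8)   # prints True
```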
 
TeethWhitener said:
How much sense does it make to have a derivative with respect to a constant (Planck's constant)?
Mathematically, there is no problem with it. Physically, you can replace the Planck constant with a true variable ##y##, perform the derivative with respect to ##y##, and then evaluate all quantities at ##y=h##. The result is the same.
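A minimal numerical sketch of that procedure, with a made-up discrete "sum over histories" and ##y## standing in for ##\hbar## (actions and the value of the constant are illustrative):

```python
import cmath

# Toy "sum over histories": hypothetical actions S_x, with y playing hbar's role.
actions = [0.3, 1.1, 2.0]

def Z(y):
    # Z(y) = sum_x exp(i * S_x / y), treating y as a genuine variable
    return sum(cmath.exp(1j * S / y) for S in actions)

hbar = 1.0   # illustrative value of the constant
eps = 1e-6

# Differentiate Z with respect to the variable y, then evaluate at y = hbar ...
numeric = (Z(hbar + eps) - Z(hbar - eps)) / (2 * eps)

# ... and compare with the analytic derivative at y = hbar:
# d/dy exp(iS/y) = exp(iS/y) * (-iS/y^2)
exact = sum(cmath.exp(1j * S / hbar) * (-1j * S / hbar**2) for S in actions)

print(abs(numeric - exact) < 1e-8)   # prints True
```

The derivative is perfectly well-defined; the only question, as noted above, is what physical insight it carries.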
 
Demystifier said:
Mathematically, there is no problem with it. Physically, you can replace the Planck constant with a true variable ##y##, perform the derivative with respect to ##y##, and then evaluate all quantities at ##y=h##. The result is the same.
Right, but I'm not sure what insight it adds. Reading the paper, it seems like the authors aren't really sure either. Although, glancing over it again, the partition function has a suggestive form ##Z=\frac{2\pi \hbar i \Delta t}{m (\Delta x)^2}##, so that if we choose ##\frac{\Delta t}{(\Delta x)^2}=\frac{1}{c\,\Delta \tilde{x}}## we get ##Z=\frac{ih}{mc\Delta \tilde{x}}##, so that the length parameter ##\Delta \tilde{x}## and the partition function ##Z## define the Compton wavelength of the particle.
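The algebra of that substitution can be checked numerically; the numbers below are illustrative electron-scale values, and only the equality of the two forms of ##Z## (via ##h = 2\pi\hbar##) is being tested:

```python
import math

# Illustrative electron-scale numbers; only the algebraic identity
# 2*pi*hbar*i*dt/(m*dx^2) == i*h/(m*c*dx_tilde) is being checked.
hbar = 1.0546e-34     # reduced Planck constant, J*s
h = 2 * math.pi * hbar
m = 9.11e-31          # electron mass, kg
c = 3.0e8             # speed of light, m/s
dx_tilde = 2.43e-12   # of order the electron Compton wavelength h/(m*c), m

# Choose dt/(dx)^2 = 1/(c * dx_tilde), e.g. with dx = 1:
dx = 1.0
dt = dx**2 / (c * dx_tilde)

Z1 = 2 * math.pi * hbar * 1j * dt / (m * dx**2)   # first form of Z
Z2 = 1j * h / (m * c * dx_tilde)                  # Compton-wavelength form

print(abs(Z1 - Z2) / abs(Z2) < 1e-12)   # prints True
```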
 
