On the relation between quantum and statistical mechanics

In summary, the paper by John Baez and Blake Pollard introduces a quantum analogue of entropy called quantropy. The analogy between entropy and quantropy is formal: the two quantities satisfy similar equations but have different physical meanings. The thread also frames the paper as a concealed critique of the AdS/CFT correspondence, as an example of two theories that are mathematically analogous without actually describing the same physics.
  • #1
Demystifier
It is well known that quantum mechanics in the path-integral form is formally very similar to equilibrium statistical mechanics formulated in terms of a partition function. In a relatively recent, very readable and straightforward paper
http://lanl.arxiv.org/abs/1311.0813
John Baez (a well known mathematical physicist) and Blake Pollard develop this formal analogy further by introducing a quantum analogue of entropy, which they call quantropy. I feel that this paper might be interesting and illuminating for many people on this forum.

Another reason for posting it is to make a concealed critique of the AdS/CFT correspondence. This example demonstrates that the mere existence of a mathematical correspondence between two theories does not mean that the two theories really describe the same physics.
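The formal analogy can be made concrete in a few lines of numerics. Below is a toy sketch (not from the paper; the energies, actions, and ħ = 1 units are made up) comparing Boltzmann weights on the statistical side with path-integral phases on the quantum side:

```python
import numpy as np

# Toy discrete model (made-up numbers, hbar set to 1) illustrating the
# formal analogy: Boltzmann weights e^{-beta E} on one side,
# path-integral phases e^{i S / hbar} on the other.
E = np.array([0.0, 1.0, 2.0])      # energies of three microstates
S = np.array([0.0, 0.7, 2.0])      # actions of three "histories"
beta, hbar = 1.0, 1.0

Z_stat = np.sum(np.exp(-beta * E))       # statistical partition function
p = np.exp(-beta * E) / Z_stat           # probabilities: real, non-negative

Z_quant = np.sum(np.exp(1j * S / hbar))  # path-integral "partition function"
a = np.exp(1j * S / hbar) / Z_quant      # amplitudes: complex, sum to 1

print("sum of probabilities:", p.sum())
print("sum of amplitudes:   ", a.sum())
```

Both sets of weights are normalized, but the p_j are non-negative reals while the a_j are complex numbers, which is exactly where the two sides of the analogy part ways.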
 
  • #2
Demystifier said:
This example demonstrates that, just because there is a mathematical correspondence between two theories, doesn't mean that the two theories really describe the same physics.
Might similar considerations apply between information theory and statistical mechanics? The fact that the same mathematical expression occurs both in statistical mechanics (thermodynamic entropy) and in information theory (information-theoretic entropy) does not in itself establish any connection between these fields, does it?

Patrick
 
  • #3
microsansfil said:
Might similar considerations apply between information theory and statistical mechanics? The fact that the same mathematical expression occurs both in statistical mechanics (thermodynamic entropy) and in information theory (information-theoretic entropy) does not in itself establish any connection between these fields, does it?
In this case the relation is much more than merely formal. Both entropies are really examples of entropy in a semantic sense, which cannot be said for quantropy.
 
  • #4
Demystifier said:
Both entropies are really examples of entropy in a semantic sense
E. T. Jaynes wrote in his book "Probability Theory: The Logic of Science"

E. T. Jaynes said:
In particular, the association of the word 'information' with entropy expressions seems in retrospect quite unfortunate, because it persists in carrying the wrong impression to so many people.

The function H is called the entropy, or better, the information entropy of the distribution. This is an unfortunate terminology, which now seems impossible to correct. We must warn at the outset that the major occupational disease of this field is the persistent failure to distinguish between the information entropy, which is a property of any probability distribution, and the experimental entropy of thermodynamics, which is instead a property of a thermodynamic state as defined, for instance, by such observed quantities as pressure, ...

Patrick
 
  • #5
I think Jaynes wanted to say that within thermal physics there are two semantically different notions of entropy: one statistical and the other thermodynamic. If that is what he wanted to say, then I agree. But the statistical Gibbs entropy in thermal physics and the statistical Shannon entropy are semantically very closely related.
 
  • #6
Demystifier said:
I think Jaynes wanted to say that within thermal physics there are two semantically different notions of entropy:
Maybe, yet his goal seems to be to show that statistical mechanics, communication theory, and a mass of other applications are all instances of a single method of reasoning.

A viewpoint from which thermodynamic entropy and information-theory entropy appear as the same concept ( http://bayes.wustl.edu/etj/articles/theory.1.pdf ).

Isn't that also the case for the concept of quantropy?

Patrick
 
  • #7
microsansfil said:
Isn't that also the case for the concept of quantropy?
No. Entropy is derived from probabilities, which are non-negative real numbers. Quantropy is derived from probability amplitudes, which are complex and in general not positive.
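A toy sketch of the contrast (made-up probabilities and actions, ħ = 1): the same formula -Σ w ln w yields a real non-negative number when the weights are probabilities, and a complex number when they are amplitudes.

```python
import numpy as np

# Entropy of a probability distribution vs. "quantropy" of an amplitude
# distribution, for made-up weights (hbar = 1).
S_actions = np.array([0.0, 0.7, 2.0])
hbar = 1.0

# Probabilities (positive, normalized) -> entropy is real and non-negative.
p = np.array([0.5, 0.3, 0.2])
entropy = -np.sum(p * np.log(p))

# Amplitudes (complex, normalized) -> quantropy Q = -sum a ln a is complex.
Z = np.sum(np.exp(1j * S_actions / hbar))
a = np.exp(1j * S_actions / hbar) / Z
quantropy = -np.sum(a * np.log(a))

print("entropy   =", entropy)     # real, >= 0
print("quantropy =", quantropy)   # complex in general
```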
 
  • #8
Interesting idea. I haven't read the entire paper too closely, but I'm wondering about the correspondence [itex]T \mapsto i\hbar[/itex]. In stat mech, you have a lot of quantities that are derivatives with respect to [itex]\beta \propto \frac{1}{T}[/itex]. For instance: [tex]\langle E \rangle = - \frac{d}{d\beta}\ln Z[/tex]
How much sense does it make to have a derivative with respect to a constant (Planck's constant)?
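On the statistical side that identity is easy to verify numerically; here is a sketch with a made-up four-level spectrum, comparing the direct thermal average with a finite-difference derivative of ln Z:

```python
import numpy as np

# Finite-difference check of <E> = -d(ln Z)/d(beta) for a toy spectrum.
E = np.array([0.0, 1.0, 2.0, 5.0])   # made-up energy levels
beta = 0.8                            # made-up inverse temperature

def lnZ(b):
    return np.log(np.sum(np.exp(-b * E)))

# Direct thermal average.
p = np.exp(-beta * E) / np.sum(np.exp(-beta * E))
E_avg = np.sum(p * E)

# Central-difference derivative of ln Z with respect to beta.
h = 1e-6
E_from_Z = -(lnZ(beta + h) - lnZ(beta - h)) / (2 * h)

print(E_avg, E_from_Z)   # agree to high accuracy
```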
 
  • #9
TeethWhitener said:
How much sense does it make to have a derivative with respect to a constant (Planck's constant)?
Mathematically, there is no problem with it. Physically, you can replace Planck constant with a true variable y, perform a derivative with respect to y, and then evaluate all quantities at y=h. The result is the same.
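That recipe can be checked in a toy model (made-up actions, ħ = 1): promote ħ to a variable y in Z(y) = Σ_j exp(iS_j/y), so that d(ln Z)/dy = -i⟨S⟩/y², and compare the derivative evaluated at y = ħ with the direct amplitude-weighted average of the action.

```python
import numpy as np

# Promote hbar to a variable y, differentiate ln Z(y), evaluate at y = hbar.
# For Z(y) = sum_j exp(i S_j / y) one has d(ln Z)/dy = -i <S> / y^2,
# hence <S> = i y^2 d(ln Z)/dy at y = hbar.
S = np.array([0.0, 0.7, 2.0])   # made-up actions
hbar = 1.0

def Z(y):
    return np.sum(np.exp(1j * S / y))

# Direct amplitude-weighted average of the action.
a = np.exp(1j * S / hbar) / Z(hbar)
S_avg = np.sum(a * S)

# Central-difference derivative with respect to y, evaluated at y = hbar.
h = 1e-6
dlnZ = (np.log(Z(hbar + h)) - np.log(Z(hbar - h))) / (2 * h)
S_from_Z = 1j * hbar**2 * dlnZ

print(S_avg, S_from_Z)   # agree to high accuracy
```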
 
  • #10
Demystifier said:
Mathematically, there is no problem with it. Physically, you can replace Planck constant with a true variable y, perform a derivative with respect to y, and then evaluate all quantities at y=h. The result is the same.
Right, but I'm not sure what insight it adds. Reading the paper, it seems like the authors aren't really sure either. Although, glancing over it again, the partition function has a suggestive form [tex]Z=\frac{2\pi \hbar i \Delta t}{m (\Delta x)^2}[/tex]
so that if we choose [tex]\frac{\Delta t}{(\Delta x)^2}=\frac{1}{c} \frac{1}{ \Delta \tilde{x}}[/tex]
we get [tex]Z=\frac{ih}{mc\Delta \tilde{x}}[/tex]
so that, up to a factor of [itex]i[/itex], the partition function [itex]Z[/itex] is the ratio of the particle's Compton wavelength [itex]h/mc[/itex] to the length parameter [itex]\Delta \tilde{x}[/itex].
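The substitution can be sanity-checked numerically (arbitrary electron-like values; this only tests the algebra, not any physics):

```python
import numpy as np

# Check: Z = 2*pi*hbar*i*dt/(m*dx^2) with dt/dx^2 = 1/(c*dX)
# reduces to Z = i*h/(m*c*dX), and |Z|*dX is the Compton wavelength h/(m*c).
hbar = 1.054571817e-34
h = 2 * np.pi * hbar
m, c = 9.109e-31, 2.998e8      # electron-like mass, speed of light
dx, dX = 1e-10, 1e-12          # arbitrary made-up lengths

dt = dx**2 / (c * dX)          # impose Delta_t / (Delta_x)^2 = 1/(c * dX)

Z1 = 2 * np.pi * hbar * 1j * dt / (m * dx**2)
Z2 = 1j * h / (m * c * dX)

lambda_C = h / (m * c)         # Compton wavelength
print(Z1, Z2)                  # identical
print(abs(Z2) * dX, lambda_C)  # |Z| * dX reproduces the Compton wavelength
```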
 

1. What is the difference between quantum mechanics and statistical mechanics?

Quantum mechanics is a branch of physics that describes the behavior of particles on a very small scale, such as atoms and subatomic particles. It is based on the principles of quantum theory, according to which particles can exist in superpositions of states and exhibit wave-like properties. On the other hand, statistical mechanics is a branch of physics that explains the behavior of large systems made up of many particles, such as gases and liquids. It uses statistical methods to calculate the properties of these systems based on the behavior of the individual particles.

2. How do quantum mechanics and statistical mechanics relate to each other?

The relation between quantum mechanics and statistical mechanics is that they describe the same physical world at different levels. Quantum mechanics provides a microscopic description of individual particles, while statistical mechanics provides a macroscopic description of large systems built from those particles. In quantum statistical mechanics, the principles of quantum mechanics determine the states over which statistical averages are taken, making the combination an important tool for understanding the behavior of complex systems.

3. What are the main concepts in quantum mechanics and statistical mechanics?

The main concept in quantum mechanics is the wave function, which describes the probability of finding a particle in a certain state. In statistical mechanics, the main concept is the distribution function, which describes the probability of finding a system in a certain state. Both of these concepts are used to calculate the properties of particles and systems, respectively.

4. How do quantum mechanics and statistical mechanics explain the behavior of matter?

Quantum mechanics explains the behavior of matter on a microscopic scale, such as the behavior of individual particles and their interactions. It also explains phenomena such as wave-particle duality and quantum entanglement. On the other hand, statistical mechanics explains the behavior of matter on a macroscopic scale, such as the properties of gases and liquids. It also explains phenomena such as phase transitions and the behavior of thermodynamic systems.

5. What are some practical applications of the relation between quantum mechanics and statistical mechanics?

The relation between quantum mechanics and statistical mechanics has many practical applications in fields such as materials science, nanotechnology, and chemistry. It is used to understand and predict the properties of materials at the atomic and molecular level, which is essential for developing new technologies and materials. It is also used in fields such as quantum computing and quantum cryptography, which utilize the principles of quantum mechanics to perform operations and transmit information at a quantum level.
