# B Born rule and thermodynamics

1. Jun 1, 2017

### Blue Scallop

Some say there is an analogy between the Born rule and thermodynamics, where the particles' locations depend on probabilities. In QM, the amplitude squared gives the probability of the particle being there. What about thermodynamics: what is the counterpart of the Born rule, and in what regions of a thermodynamic system do the probabilities vary? In a box with the central divider removed, the particles would spread to the other side.. so what experimental setup or system can you make where the probabilities vary by region (as an analogy to the Born rule)?

2. Jun 1, 2017

### vanhees71

The general Born rule is
$$\langle A \rangle = \mathrm{Tr} (\hat{\rho} \hat{A}),$$
where $A$ is an observable, $\hat{A}$ its representing (usually self-adjoint) operator, and $\hat{\rho}$ the statistical operator of the system.

In thermodynamics, i.e., thermal equilibrium for the grand-canonical ensemble you have the statistical operator
$$\hat{\rho}=\frac{1}{Z} \exp[-\beta (\hat{H}-\sum_j \mu_j \hat{Q}_j)]$$
with the partition sum
$$Z=\mathrm{Tr} \exp[-\beta (\hat{H}-\sum_j \mu_j \hat{Q}_j)].$$
$$\beta=1/(k_{\text{B}} T)$$ is the inverse temperature (modulo the Boltzmann constant if you use "unnatural units"), and the $\mu_j$ are chemical potentials for conserved charges $Q_j$ of the system (if there are no conserved charges, there are no chemical potentials in the equilibrium distribution).
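This trace formula can be checked numerically on a toy system. Below is a minimal sketch (hypothetical two-level Hamiltonian with made-up energies, canonical ensemble with no conserved charges, natural units $k_{\text{B}}=1$) that builds $\hat{\rho}=\exp(-\beta \hat{H})/Z$ and evaluates $\langle A \rangle = \mathrm{Tr}(\hat{\rho}\hat{A})$:

```python
import numpy as np

# Minimal sketch (hypothetical two-level Hamiltonian, canonical ensemble,
# no conserved charges, natural units k_B = 1): build the statistical
# operator rho = exp(-beta H)/Z and evaluate <A> = Tr(rho A).

beta = 2.0                           # inverse temperature 1/(k_B T)
H = np.diag([0.0, 1.0])              # two-level Hamiltonian (made-up energies)

# exp(-beta H) computed in the energy eigenbasis
eigvals, eigvecs = np.linalg.eigh(H)
exp_mH = eigvecs @ np.diag(np.exp(-beta * eigvals)) @ eigvecs.conj().T

Z = np.trace(exp_mH).real            # partition sum Z = Tr exp(-beta H)
rho = exp_mH / Z                     # statistical operator, Tr rho = 1

A = np.diag([0.0, 1.0])              # observable: excited-level occupation
expectation = np.trace(rho @ A).real

# cross-check against the explicit Boltzmann weight of the excited level
print(expectation, np.exp(-beta) / (1.0 + np.exp(-beta)))
```

For this diagonal example the trace formula reduces to the familiar Boltzmann weight $e^{-\beta E_1}/Z$, which the last line confirms.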

3. Jun 1, 2017

### A. Neumaier

Nobody but you calls this the Born rule. The true Born rule relates measurement results to eigenvalues.

4. Jun 1, 2017

### Blue Scallop

Note this is a B level thread so we need more words and conceptual illustrations. I got the QM-thermodynamics idea from the September 2004 issue of Scientific American, in the article "Was Einstein Right?". Here's the excerpt for context:

"AN ANALOGY is to Brownian motion. The jiggling of dust
motes looks random, but as Einstein himself demonstrated, it
is caused by unseen molecules following classical laws. In fact,
this analogy is tantalizingly tight. The equations of quantum
mechanics bear an uncanny resemblance to those of the kinetic
theory of molecules and, more generally, statistical mechanics.
In some formulations, Planck’s constant, the basic
parameter of quantum theory, plays the mathematical role of
temperature. It is as though quantum mechanics describes
some kind of gas or ensemble of “molecules”—a chaotic soup
of more primitive entities."
<one paragraph skipped>
"Over the past five years, though, hidden variables have
come back from the dead, thanks largely to Gerard ’t Hooft of
the University of Utrecht in the Netherlands, a Nobel laureate
quantum mechanician known for toying with radical hypotheses.
He argues that the salient difference between quantum
and classical mechanics is information loss. A classical
system contains more information than a quantum one,
because classical variables can take on any value, whereas
quantum ones are discrete. So for a classical system to give rise
to a quantum one, it must lose information. And that can happen
naturally because of friction or other dissipative forces.

If you throw two pennies off the Empire State Building at
different speeds, air friction causes them to approach the
same terminal velocity. A person standing on the sidewalk
below can scarcely tell the precise velocity at which you threw
the pennies; that information is a hidden variable. In this situation
and many others, a wide range of starting conditions
lead to the same long-term behavior, known as an attractor.
Attractors are discrete—just like quantum states. The laws
they obey derive from, but differ from, Newton’s laws. In
fact, ’t Hooft asserts, the derived laws are none other than
quantum mechanics. Therefore, nature could be classical at
its most detailed level yet look quantum-mechanical because
of dissipation. “You’d have quantum mechanics as a low-energy
limit of some fundamental theory,” says Massimo Blasone
of the University of Salerno in Italy."

PF fellas, I'd like to know.. what classical systems have regions where there is a higher probability of occurrence, akin to the Born rule? I need verbal descriptions and illustrations rather than dense math that only Neumaier can understand....

5. Jun 2, 2017

### vanhees71

One of the advantages of using the statistical operator to represent the state is that it is basis independent: you can evaluate the trace in any basis you want. The rule given above implies that the probability of finding the value $a$ of the observable $A$, which is an eigenvalue of the corresponding representing operator $\hat{A}$, is given by
$$P(a)=\mathrm{Tr} (\hat{P}_a \hat{\rho}),$$
where
$$\hat{P}_a=\sum_{\beta} |a,\beta \rangle \langle a,\beta|$$
is the projector onto the eigenspace of $\hat{A}$ for the eigenvalue $a$, and $|a,\beta \rangle$ are an orthonormal set of vectors spanning this eigenspace. Using the eigenbasis you indeed get Born's rule in the sense you seem to have in mind,
$$P(a)=\sum_{\beta} \langle a,\beta|\hat{\rho}|a,\beta \rangle.$$
The special case of a pure state follows too, of course, since then by definition there's a normalized vector $|\psi \rangle$ such that $\hat{\rho}=|\psi \rangle \langle \psi|$,
$$P(a)=\sum_{\beta} |\langle a,\beta|\psi \rangle|^2.$$
This is Born's rule in the original sense for pure states, giving the one and only interpretation of the meaning of Schrödinger's "wave function" that's compatible with observations.
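These formulas can be sketched numerically. Below is a minimal example (hypothetical qubit; the state vector and the $a=+1$ eigenvector are made up for illustration) that computes $P(a)$ both as $\mathrm{Tr}(\hat{P}_a \hat{\rho})$ and directly as $|\langle a|\psi \rangle|^2$, confirming they agree for a pure state:

```python
import numpy as np

# Sketch of P(a) = Tr(P_a rho) for a pure state rho = |psi><psi|.
# Hypothetical qubit example: the a = +1 eigenspace is spanned by
# |up> = (1, 0); the state psi is made up for illustration.

psi = np.array([3.0, 4.0]) / 5.0      # normalized state vector
rho = np.outer(psi, psi.conj())       # pure-state statistical operator

up = np.array([1.0, 0.0])             # eigenvector for a = +1
P_up = np.outer(up, up.conj())        # projector onto that eigenspace

p_trace = np.trace(P_up @ rho).real   # Born rule via the trace formula
p_born = abs(up.conj() @ psi) ** 2    # |<a|psi>|^2 directly

print(p_trace, p_born)                # both equal 0.36
```

Both routes give $|\langle \mathrm{up}|\psi \rangle|^2 = (3/5)^2 = 0.36$, as expected.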

6. Jun 2, 2017

### Blue Scallop

In the double-slit experiment, the Born rule says the electron is more likely to hit the screen in the regions of constructive interference.

Can you give an example of a system in thermodynamics where such regions of higher and lower probability occur (akin to the interference pattern in the double-slit experiment)?

7. Jun 2, 2017

### Prathyush

The two are not exactly equivalent; what we call the Born rule refers to the second one, i.e., the probabilities of individual measurements. In the context of measurements of microscopic systems we derive the first from the second. The second is clearly the more precise statement in the context of microscopic measurements.

I have yet to study the claim that $\langle A \rangle = \mathrm{Tr} (\hat{\rho} \hat{A})$ contains everything we need to understand about macroscopic observables. In particular I want to study its domain of validity and its contrast with the measurement of individual microscopic quantities.

Last edited: Jun 2, 2017
8. Jun 2, 2017

### vanhees71

Where is the "contrast" here? The expectation value can be measured only on an ensemble of stochastically independently prepared systems (prepared in the state described by $\hat{\rho}$, of course). About an individual measurement QT only tells you the probabilities of measuring each of the possible values of the measured observable, and again you can verify these only on an ensemble.

9. Jun 2, 2017

### Prathyush

The contrast lies in how we interact with the system. In the case of a macroscopic system, we interact with it using coarse-grained probes, so a thermodynamic description is applicable, and what we are measuring is indeed $\langle A \rangle$ and not statistics obtained over an ensemble of measurements. One can also imagine a situation (admittedly hard to construct when the degrees of freedom reach $10^{23}$) where we consider an apparatus intended to measure an individual outcome; in that case we have to apply the second rule, because we are in fact talking about the probabilities of individual measurements.

This is why macroscopic measurements and microscopic measurements are in essence different. I am reasonably confident that they can be harmonized, and that this would lead to a deep clarification of the foundations of quantum mechanics and statistical mechanics. I also think that physicists are only beginning to understand this problem.

In our discussions we never mentioned Landauer's principle, which is currently (in my understanding) only a heuristic. In the same sense in which we understand Carnot's engine through the framework of statistical mechanics, we should be able to understand Landauer's principle.

10. Jun 2, 2017

### vanhees71

The reason why for macroscopic systems it appears as if we "measure $\langle A \rangle$" almost with certainty is that it is a coarse-grained quantity as you say, and the fluctuations around this expectation value of the coarse-grained quantity are quite small. If you are at or close to equilibrium it's indeed a thermodynamical average, i.e., you can get the expectation value with the canonical or grand-canonical ensembles with good accuracy, and you can estimate the uncertainty by the standard deviation. If you coarse grain "coarse enough", the fluctuations (standard deviations) are small compared to the measured value and/or the accuracy of the measurement apparatus.
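The suppression of fluctuations under coarse-graining can be illustrated with a classical toy model (made up for illustration, not part of the thread's quantum discussion): averaging $N$ independent random spins gives a coarse-grained magnetization whose relative standard deviation shrinks like $1/\sqrt{N}$, so for large $N$ the macroscopic value looks sharp.

```python
import numpy as np

# Classical toy model (made up for illustration): the coarse-grained
# magnetization of N independent biased spins. Its relative fluctuation
# (standard deviation / mean) shrinks like 1/sqrt(N), so for large N
# the macroscopic value looks sharp.

rng = np.random.default_rng(0)

def relative_fluctuation(n_spins, n_samples=2000):
    # each spin is +1 with probability 0.75 and -1 with probability 0.25,
    # so the mean spin is 0.5; averaging over n_spins coarse grains it
    spins = rng.choice([1.0, -1.0], p=[0.75, 0.25], size=(n_samples, n_spins))
    m = spins.mean(axis=1)            # coarse-grained observable
    return m.std() / m.mean()

for n in (10, 100, 1000):
    print(n, relative_fluctuation(n))
```

Here the relative fluctuation drops by roughly a factor of ten for each hundredfold increase in $N$, which is the central-limit scaling; at $N \sim 10^{23}$ it would be utterly negligible.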

11. Jun 2, 2017

### RockyMarciano

But you are assuming the coarse graining is a reversible process, right?

12. Jun 2, 2017

### vanhees71

How can it be a reversible process? I throw away a vast amount of information, i.e., I don't look at most of the microscopic details, and I thus cannot expect to gain much information about these details.

13. Jun 2, 2017

### RockyMarciano

Coarse-graining is in this general sense irreversible, but you are applying it to QM, in the ensemble interpretation and in the limit where the standard deviations become negligible compared to the expectation value (i.e., in the limit where those microscopic details don't contain additional information).

14. Jun 2, 2017

### Prathyush

There are two kinds of uncertainty here which must be formally distinguished. The kind you are talking about is the same as in classical statistical mechanics, and can be understood as an application of the central limit theorem.

The second kind is the fundamental uncertainty of quantum mechanics, which appears when we talk about measurements of individual particles. It is logically independent of the first. Born's rule talks about this second kind of uncertainty.

One point I want to emphasize is that the word "coarse" refers not only to the type of system under investigation but also to the type of probe we use to analyze the system. The reason our theories work well is that we are in a domain of experience where we usually don't see the nuances of what it means to be coarse grained. For instance, our theory of superconductivity works extremely well precisely because of this kind of limiting process: we work in the limit in which the method used to observe a macroscopic system does not appreciably affect the system, because the interaction can be neglected compared to other scales in the problem.

Bohr emphasized this point often, that the interaction of the system with the apparatus is uncontrollable.

As one makes observations of macroscopic systems more precise, using a more detailed (or higher-energy) probe, the interaction of the probe with the system becomes important. Consider the density operator ($\sim \phi^\dagger \phi$), for instance: when we use it, we usually assume that it is highly coarse grained, so that a fluid-dynamic approximation is convenient. However, if we use a detailed (high-energy) probe, we will appreciably affect the system under investigation and have to use a microscopic description to study it.

This is an elaboration on the contrast between the two kinds of systems, and also hopefully a clarification on how macroscopic descriptions and microscopic descriptions must be used.

15. Jun 2, 2017

### Demystifier

The Born rule is about the probability $p$, not the average value. So the general Born rule is actually
$$p = \mathrm{Tr} (\rho \pi),$$
where $\pi=\pi^2$ is a projector.

16. Jun 2, 2017

### vanhees71

Ok, I've no problem with that. Isn't this implied by what I wrote down? In a sense the projector $\hat{\pi}$ is also a special observable, measuring whether the system is prepared such that $A$ takes the value $a$ (of course, again, $\hat{\pi}=\sum_{\beta} |a,\beta \rangle \langle a,\beta|$). As a projector it can take only the eigenvalues 0 or 1, i.e., if the value is determined by the state the system is prepared in, it answers the question whether $A$ takes the value $a$ or not by giving the value 1 or 0, respectively. If the value is indeterminate, $P(a)=\mathrm{Tr} (\hat{\rho} \hat{\pi}) \in [0,1]$ is the probability to obtain the value $a$.

17. Jun 2, 2017

### Prathyush

Quantum mechanics is a fundamental theory. There are many people who attempt to explain it in terms of classical mechanics, but they have failed so far. I don't find their motivations sound: they are uncomfortable with the notion that nature can be fundamentally probabilistic, so they do weird things to make quantum mechanical uncertainty appear like classical noise.

In regard to the analogy to Brownian motion: it is well known that the heat equation looks like the Schrödinger equation upon Wick rotation. However, this does not imply that quantum mechanical uncertainty arises from an underlying classical picture.

I have a friend who believes, in my opinion for good reasons, that this analogy between statistical mechanics and quantum mechanics has deep implications. See https://en.wikipedia.org/wiki/Wick_rotation
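For readers who haven't seen it, the Wick-rotation connection mentioned above is a one-line computation for the free particle. Substituting $t = -i\tau$ in the free Schrödinger equation,
$$i\hbar \partial_t \psi = -\frac{\hbar^2}{2m} \partial_x^2 \psi \quad \longrightarrow \quad \partial_\tau \psi = \frac{\hbar}{2m} \partial_x^2 \psi,$$
turns it into the heat (diffusion) equation with diffusion constant $D = \hbar/(2m)$; the quantum propagator and the heat kernel are related by the same substitution.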

18. Jun 5, 2017

### A. Neumaier

But only for the methods, not for the interpretation.

19. Jun 7, 2017

### Prathyush

Well, he thinks that it will play an important role in understanding quantum gravity. The fact that black hole entropy and temperature can be derived from analytic continuation is certainly worth investigating.

20. Jun 8, 2017

### Blue Scallop

Demystifier,

As one who has thought about MWI inside out, I have a question. Unitarity occurs in the interpretations where the squared amplitude of the wave function gives the probabilities, and it's related to the Born rule. But in MWI, where the Born rule is difficult to integrate, does that mean unitarity is also in serious trouble? Or is unitarity not directly related to the Born rule at all?