Quantum probability versus entropy

In summary, the conversation discusses the relation between quantum probability and entropy, specifically in the context of excited molecules and quantum selection rules. There is a definition of entropy for quantum systems once mixed states are considered, but no well-accepted statistical foundation for quantum mechanics. The conversation also touches on density matrices and their role in describing the state of a system. Ultimately, there are different views on the significance of density matrices relative to spinor states, but these views do not affect calculations.
  • #1
Tsunami
This is just an intuitive question, sparked by studying my courses. I haven't got the time to elaborate or search much myself at the moment, my apologies for this.

I was wondering whether there is a relation between quantum probability and entropy. A naive formulation of entropy is that a system will tend to its most probable state, which is the state lowest in energy.
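To put that naive statement in a more standard form, assuming thermal equilibrium and ignoring degeneracies: the relative population of an excited level a distance [tex]\Delta E[/tex] above the ground level is set by the Boltzmann factor, so the lower level is exponentially favoured,

[tex]\frac{N_\text{excited}}{N_\text{ground}} = e^{-\Delta E / k_B T}.[/tex]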

When a molecule gets excited to a higher energy level, this law generally continues to hold: the excited state soon falls back to the ground state, emitting radiation as it does so.

However, sometimes this process is forbidden by quantum selection rules. When this happens, the excited molecule resists deactivation by radiation, and the lifetimes of such excited molecules can then be as long as several hours.

So, at least for a short while, the naive formulation of entropy seems to be challenged.

There does seem to be a limit to this: see for instance the Treanor effect, in which energy is similarly pumped up toward the highest excited states until a certain level is reached, after which relaxation occurs.
 
  • #2
This is another cool question.

You probably already know that if you get away from pure states, and begin considering mixed states, there is a definition of entropy for quantum systems. Here's a reference:
http://en.wikipedia.org/wiki/Density_matrix

One of the mysteries of QM / QFT is that statistical mechanics is always just a heartbeat away (for example, in the analogies to critical exponents), yet there is no well-accepted statistical foundation for QM. The Boltzmann weight [tex]e^{-\beta H}[/tex] of the statistical partition function is related to the time-evolution factor [tex]e^{-iHt}[/tex] that appears in the functional integral method. One of the problems in connecting these ideas is that doing so seems to require a Euclidean version of spacetime; that is, it's related to Wick rotation. I think Zee's textbook covers this relationship in detail, but I can't find my copy at the moment (too many books). The connection could just reflect the fact that the exponential function is important in all of mathematics, or it could be something fundamental.
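To make that connection concrete, the standard Wick-rotation statement (nothing beyond the textbook relation, with [tex]\hbar = 1[/tex]) is that continuing time to imaginary values turns the time-evolution factor into the Boltzmann weight:

[tex]e^{-iHt}\Big|_{t \,\to\, -i\beta} = e^{-\beta H}, \qquad Z = \mathrm{Tr}\, e^{-\beta H}.[/tex]

So the thermal partition function looks like the trace of the evolution operator continued to imaginary time, which is why the Euclidean version of the theory keeps showing up.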

One of the problems in physics is that of guessing what is fundamental, and what is mathematical convenience. The obvious statistical formulation for QM requires taking a whole bunch of physical concepts and demoting them to being mere mathematical conveniences, while at the same time taking a whole bunch of the methods physicists use to make calculations, and promoting them from mathematical conveniences to physical concepts.
 
  • #3
I have a question about the wikipedia link given above.

"This entropy can increase but never decrease[4][5] with a measurement. The entropy of a pure state is zero, while that of a proper mixture always greater than zero. Therefore a pure state may be converted into a mixture by a measurement, but a proper mixture can never be converted into a pure state. Thus the act of measurement induces a fundamental irreversible change on the density matrix; this is analogous to the "collapse" of the state vector, or wavefunction collapse."

It says that a proper mixture can never be converted to a pure state "by a measurement". As the original poster said, things like optical pumping and certain coherent control processes can move populations of an ensemble to a much less probable state, if not a "pure state". Am I missing something about the way they use the word "measurement" in the wikipedia reference?
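For reference, here is a minimal numerical sketch (my own, not from the article) of the entropy values being discussed: the von Neumann entropy [tex]S(\rho) = -\mathrm{Tr}(\rho \ln \rho)[/tex] is zero for a pure state and [tex]\ln 2[/tex] for the 50/50 qubit mixture.

[code]
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -sum_i p_i ln p_i over the eigenvalues p_i of rho."""
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]                  # drop zero eigenvalues (0 ln 0 -> 0)
    return float(-np.sum(p * np.log(p)))

pure_up = np.array([[1.0, 0.0],
                    [0.0, 0.0]])      # |z+><z+|, a pure state
maximally_mixed = 0.5 * np.eye(2)     # a proper 50/50 mixture

print(von_neumann_entropy(pure_up))          # ~0.0
print(von_neumann_entropy(maximally_mixed))  # ~0.693 = ln 2
[/code]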
 
  • #4
I don't think you're missing anything at all. I think the wikipedia article is just backwards and wrong when it says that measurement cannot decrease the entropy. Measurement always converts any state (high entropy or low) to a pure state.

Now let me go read it in context to see if I'm missing something... Okay, yes, I think I can explain the discrepancy.

When I say "density matrix" I'm thinking of a description of a single system. For me QM applies to single systems, which is part of the mystery of the stuff. For the author of that article, "density matrix" is a term that applies to an ensemble of states.

Suppose you had an ensemble of states, all of which happened to be pure +z spin states (i.e. spin up), and you measured their spin in the x direction. What you'd get is a mixture that is 50% +x and 50% -x. That is, you would have converted a pure state not only into a mixed state, but into a mixed state with maximum entropy. This is what the author is referring to.
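Here is a minimal sketch of that in code (my own illustration, not anything from the article): prepare the ensemble in the pure state |z+>, apply an x measurement without keeping track of the outcomes, and the density matrix becomes the maximally mixed one.

[code]
import numpy as np

x_plus  = np.array([1.0,  1.0]) / np.sqrt(2)
x_minus = np.array([1.0, -1.0]) / np.sqrt(2)

rho_pure = np.array([[1.0, 0.0],
                     [0.0, 0.0]])      # |z+><z+|, entropy zero

def measure_x(rho):
    """Unread x measurement: rho -> P_+ rho P_+ + P_- rho P_-."""
    out = np.zeros_like(rho)
    for v in (x_plus, x_minus):
        P = np.outer(v, v)             # projector onto an x eigenstate
        out = out + P @ rho @ P
    return out

print(measure_x(rho_pure))             # 0.5 * identity: 50% +x, 50% -x
[/code]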

One of my many little heresies with respect to QM is that I think the density matrix states are fundamental, not the spinor states. So to me, the density matrix itself is sufficient to completely describe the state. So long as you stick to pure states you can always convert back and forth between spinors and density matrices, but mixed states don't work like that. A state which is half-and-half spin up/down cannot be distinguished, at the density matrix level, from a state which is half-and-half spin left/right, because both have the same density matrix:

[tex]\left(\begin{array}{cc}0.5&0\\0&0.5\end{array}\right)[/tex]

As far as I know, if you have a population of spin-1/2 particles that are modeled with the above density matrix, you cannot figure out if they got that way as a result of an x or a z measurement. This is contained in the statement that all the information about the state is contained in the density matrix. Now to a person who is a density matrix formalism believer, like me, this is just what you expect. But to a person who is a spinor formalism believer, like most everybody else, this means that there is secret (hidden variables) information that has been destroyed in the density matrix description.
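And the indistinguishability itself is a one-line check (again, just my own illustration): an equal mixture of z eigenstates and an equal mixture of x eigenstates give one and the same density matrix.

[code]
import numpy as np

z_plus  = np.array([1.0, 0.0])
z_minus = np.array([0.0, 1.0])
x_plus  = np.array([1.0,  1.0]) / np.sqrt(2)
x_minus = np.array([1.0, -1.0]) / np.sqrt(2)

rho_from_z = 0.5 * np.outer(z_plus, z_plus) + 0.5 * np.outer(z_minus, z_minus)
rho_from_x = 0.5 * np.outer(x_plus, x_plus) + 0.5 * np.outer(x_minus, x_minus)

print(np.allclose(rho_from_z, rho_from_x))   # True: both are 0.5 * identity
[/code]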

These two views make no difference when it comes to calculating; it is just a matter of taste, or faith. But in the density matrix formalism, it makes sense to talk about the entropy of a single quantum state (I've learned to call these things "qubits", a term that did not exist when I learned this subject). In the spinor formalism, the density matrix itself cannot describe a single quantum object, so for them, to say the words "density matrix" means that they HAVE to be talking about an ensemble.

So now I'm satisfied with what they mean when they say "measurement cannot decrease entropy". But that doesn't mean I like it.

As it turns out, I am a great fan of Julian Schwinger's "measurement algebra", which cuts to the heart of this issue. Schwinger works in pure states only (actually, this is not quite true, but it is close enough), and his measurement algebra corresponds to the operators of the Stern-Gerlach experiment.

And the operation of a Stern-Gerlach experiment is precisely what is required to reduce the entropy in a beam of atoms -- the apparatus splits the beam into two beams, each of which is pure. You could then recombine the beams with the appropriate alteration on one path, and by that method turn the beam of atoms from a mixed state back to a pure one. This is a reversal of entropy, but to make it happen, of course you have to increase entropy in the apparatus or its power supply, etc.

So I think the wikipedia article is confusing, but I have no doubt that if you look up their references you will find that they are good peer reviewed opinions and all that. On the other hand, you can also find references that will see things the way I do.

Carl

I suppose I should admit that I own two websites that discuss these matters: http://DensityMatrix.com and http://MeasurementAlgebra.com but that my opinions are only my own. If you are turning in homework to be graded by a professor, you'll find that you do best if you parrot his version of reality, no matter how sloppy it is. This is the best excuse for attending class I can think of.
 
  • #6
Cool article, and I've only read up through the first paragraph.

Down to the bottom of the 2nd page. This is stunning. He's bringing in the density matrix to define the "relative" von Neumann entropy:
[tex]H = \int dq\;\;\rho\;\ln(\rho/|\psi|^2),[/tex]

where "q" defines the states to integrate over. Of course I'm a big fanatic on density matrices.

Now I've reached page 5. A very well written and easy to understand paper. As is natural for Bohmian mechanics, there's a concentration on velocity vector fields. I've been working in the internal symmetries of particles for so long that I'd forgotten that this is an attribute of Bohmian mechanics. As it turns out, I also believe that velocities are more important than momenta, and my book on the density operator formalism for the internal symmetries of the standard model deals with velocity eigenstates rather than momentum eigenstates. Thanks for the reference, I'll add citations to this. And 20 pages still to read...

I'm getting into the numerical part. Nice drawings. Oh, this is beautiful, the drawings of the wanderings of the orbit, the divergence of neighboring trajectories...

Halfway through. The numerical calculations must have been heroic for this. No wonder they stopped at 2D instead of 3D. Truly a remarkable paper. Where it differs from stuff I've attempted along this line is that they chose an initial pilot wave that was not a pure state for energy. It has me thinking of many things. There must be orbits that are cyclic, and neighborhoods that have transitions to chaotic behavior. Or will all that get wiped out in the coarse graining? I just realized that this all happens because the pilot wave automatically causes a space filling orbit to spend the right amount of time in any given region by defining the velocities of the particles in that neighborhood. Since the velocities are right, the densities have to be right too.

Oh man, this really does explain why the entropy equation works the way it does.

And it ends with a long list of references, so now I've got some more researchers whose papers I'll have to read. Here's quite a gift, a 500+ page book on the 1927 Solvay conference:
http://arxiv.org/abs/quant-ph/0609184
 
  • #7
Another great article from (more or less) this group of people discusses a subtle refinement of the usual boson-versus-fermion distinction, one I agree with wholeheartedly:

Remarks on identical particles in de Broglie-Bohm theory
Harvey R. Brown (Oxford), Erik Sjoeqvist (Uppsala), Guido Bacciagaluppi (Oxford), Phys.Lett. A251 (1999) 229-235
It is argued that the topological approach to the (anti-)symmetrisation condition for the quantum state of a collection of identical particles, defined in the `reduced' configuration space, is particularly natural from the perspective of de Broglie-Bohm pilot-wave theory.
...
The difference between the case of bosons and that of fermions lies not in a non-zero probability for coincidence of bosons (given that the set of coincidences has [tex]|\psi|^2[/tex]-measure zero irrespective of whether or not the wave function vanishes at the coincidences); it lies rather in the fact that while at a node the phase of the wave function is ill-defined, and thus the de Broglie–Bohm dynamics breaks down for coincident fermions, the trajectories of coincident bosons, if one should wish to retain them, would be well-defined for all times.
http://arxiv.org/abs/quant-ph/9811054

Another way of saying the same thing is that in Bohmian mechanics, one can argue that the intrinsic difference between bosons and fermions is that fermions have a force that tends to separate the particles, while bosons do not. Then the tendency of bosons to congregate becomes just a consequence of [tex]P = |\psi|^2[/tex], that is, it is nothing more than the essential nonlinearity between the wave function and the probability density. This makes me think that if you write down a bosonic theory in density matrix form, the nonlinearity will disappear and the bosons will no longer be seen as tending to clump (in position space, of course). Or have I gone one over my limit for reading papers this morning?
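For what it's worth, here is the textbook two-particle version of that nonlinearity (my own illustration, not taken from the Brown et al. paper): symmetrise two bosons occupying distinct one-particle states [tex]\psi_a, \psi_b[/tex] and the coincidence probability is doubled purely by the squaring,

[tex]\psi_S(x_1,x_2) = \frac{1}{\sqrt{2}}\left[\psi_a(x_1)\psi_b(x_2) + \psi_a(x_2)\psi_b(x_1)\right], \qquad |\psi_S(x,x)|^2 = 2\,|\psi_a(x)|^2\,|\psi_b(x)|^2,[/tex]

twice the distinguishable-particle value, with nothing attractive anywhere in the dynamics.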
 
  • #8
Check out Landau and Lifshitz's book and Feynman's book on Statistical Mechanics.
Regards,
Reilly Atkinson
 

1. What is quantum probability?

Quantum probability is a mathematical framework used to describe the probability of outcomes in quantum systems. It takes into account the inherent randomness and uncertainty of quantum mechanics, which is fundamental to the nature of quantum particles.

2. How does quantum probability differ from classical probability?

Classical probability describes uncertainty about systems that are, at bottom, deterministic: with complete knowledge of the initial conditions, the outcome of an event could in principle be predicted with certainty. In contrast, quantum probability deals with outcomes that are irreducibly probabilistic, due to the inherent uncertainty of quantum mechanics.

3. What is entropy in the context of quantum mechanics?

In quantum mechanics, entropy is a measure of the disorder or uncertainty in a system. It is related to the number of possible states that a system can be in, and it increases as the system becomes more disordered or unpredictable.

4. How are quantum probability and entropy related?

Quantum probability and entropy are closely related: the entropy of a quantum state quantifies how uncertain, or mixed, its probability assignments are. In quantum information theory, entropy is used to quantify the amount of uncertainty in a quantum system.
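The standard quantitative link is the von Neumann entropy of the density matrix [tex]\rho[/tex],

[tex]S(\rho) = -\mathrm{Tr}\left(\rho \ln \rho\right),[/tex]

which is zero for a pure state and increases as the state becomes more mixed.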

5. What are some real-world applications of quantum probability and entropy?

Quantum probability and entropy have a wide range of applications in fields such as quantum computing, cryptography, and quantum information processing. They are also used in understanding the behavior of complex systems, such as in biology, economics, and social sciences.
