Entropy in isolated quantum systems

Summary
Loschmidt's paradox raises questions about entropy in isolated quantum systems, where unitary time evolution suggests that von Neumann entropy remains constant. This contrasts with classical systems, where entropy can increase due to mixing. The discussion highlights the role of entanglement, noting that while the entropy of subsystems may increase, the total entropy of the entire system remains conserved. Participants debate the implications of isolated systems and the nature of time asymmetry, questioning whether a universal arrow of time exists without interactions with the environment. The conversation emphasizes the need for a deeper understanding of quantum mechanics and its relationship to classical statistical mechanics.
  • #31
A. Neumaier said:
However, the universe must have a state. For if not, there would have to be a limit on the size of a system to be described by a state. This size would be completely arbitrary.

I don't know. In the Copenhagen spirit I would argue that the remaining part of the universe (given that it is a compact manifold so that your limit argument makes sense) has to be large enough to define a classical system, i.e. an observer.
 
  • #32
DrDu said:
I don't know. In the Copenhagen spirit I would argue that the remaining part of the universe (given that it is a compact manifold so that your limit argument makes sense) has to be large enough to define a classical system, i.e. an observer.

In the Copenhagen spirit, a classical system is not an observer. The wave function is a device to encapsulate knowledge gained from previous measurements. A classical system has no "knowledge" unless it is a human or machine making QM calculations.
 
  • #33
A. Neumaier said:
In Boltzmann's analysis the environment remains hidden, but acts by restoring the independence assumption at _all_ times.

Yes, and restoring independence at all times amounts to ignoring the correlations that result from collisions. The total entropy is equal to the marginal entropy (the entropy resulting from assuming independence at all times) plus the mutual entropy (the entropy arising from the correlations). The total entropy is constant; the marginal entropy is identified with the thermodynamic entropy and always increases.
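To make the bookkeeping explicit, here is the standard information-theoretic identity for two subsystems (a textbook relation, not from the thread; in this notation the "mutual entropy" above corresponds to the negative of the mutual information I_{12}):

```latex
% Joint entropy = sum of marginal entropies minus the mutual information:
S_{12} = S_1 + S_2 - I_{12}, \qquad I_{12} = S_1 + S_2 - S_{12} \ge 0 .
% Under the exact dynamics S_{12} is constant, so any growth of the
% marginals S_1 + S_2 is exactly matched by growth of I_{12}.
```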
 
  • #34
DrDu said:
I don't know. In the Copenhagen spirit I would argue that the remaining part of the universe (given that it is a compact manifold so that your limit argument makes sense) has to be large enough to define a classical system, i.e. an observer.

What defines an observer, in physical terms? This is an ill-defined notion.

The advantage of my thermal interpretation (see https://www.physicsforums.com/showthread.php?t=490492 ) is that one doesn't need such ill-defined terms to make sense of quantum mechanics. Classical systems are simply quantum mechanical systems in which all observables of interest are almost certain in a well-defined sense. And an observer is nowhere needed since the whole physics is observer-independent.
 
  • #35
Rap said:
Yes, and restoring independence at all times amounts to ignoring the correlations that result from collisions.
And ignoring this is the reason for the increase of entropy.
 
  • #36
SpectraCat said:
Thank you for that explanation ... that is the way I understood the math to work in the case where the density matrix is diagonal. The issue with the example that I gave is that the off-diagonal elements of the matrix are time dependent ... i.e. the magnitudes of the anharmonic cross-couplings between the vibrational modes are dependent on the populations of those modes (by population I mean the number of vibrational quanta in a given mode). I believe the ramifications of this are that you cannot choose a unique basis that diagonalizes the density matrix at all times, but like I said, I am not that well-versed in the details of density matrices, so I may not have that completely correct.

However, I have an even more fundamental problem because I do not understand from a physical point of view how the entropy of the isolated vibrational system in the example I gave can possibly be time-invariant. It seems intuitively clear to me from the phenomenology of intramolecular vibrational redistribution that the process is entropically driven. In other words, the energy starts out localized as a single quantum of excitation in a single vibrational mode, and then becomes "randomized" as one or more quanta of excitation in multiple vibrational modes with lower energy. The total internal energy of the system remains constant, but the probability of the energy finding its way back into the mode that was initially excited is (I think) vanishingly small. That seems consistent with the evolution of the state from low entropy (all the energy in a single mode) to higher entropy (the energy redistributed among many modes).

A possible counter-argument to the description I gave above might be that, even though the energy is "randomized", at any instant in time it is described by a unique "pure state" of the system, which would have the same von Neumann entropy (zero) as the initial state with a single quantum of excitation in a single mode. This argument is probably valid for small molecules where the density of states is low; however, molecules belonging to point groups with degenerate irreducible representations have formal degeneracies that would give rise to non-zero entropies for particular combinations of vibrational quanta.

I would appreciate any insights you have on this ...

If the system is isolated in the sense that all interactions are described by a single basis (that is, nothing is "traced out"; another way to think of this is that everything is evolving under the same Hamiltonian), then there is no "randomizing" and all evolution is unitary (reversible). What you then have is COHERENT DEPHASING of the wave packet made up of vibrational states, not the flow of energy from vibrational modes into the environment that is usually meant when IVR is discussed. The intuition that the likelihood of ALL of the energy making its way back to the original mode is "vanishingly small" feels correct because the time it takes for the wave packet to reach a total revival (reacquire exactly its initial state) grows with the number of modes making up the wave packet. For rotational wave packets, for example, I believe it goes as the square of the number (I'm not sure about vibrations). Therefore, the larger the system is, the LONGER it will take for the revival to occur; but if it's actually isolated in the sense discussed above, it WILL happen and hence there is no increase in entropy.

As someone else mentioned, another way to think about this is that under unitary evolution the eigenvalue spectrum of the density matrix is preserved at all times. Therefore, in some basis it will always be diagonal with the same "populations"; all that happens is that the basis in which it is diagonal rotates in time.
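As a minimal numerical check of this last point (a toy sketch in Python; the Hamiltonian and initial state are arbitrary illustrative choices, not the molecular system discussed):

```python
# Unitary evolution rho(t) = U rho0 U† preserves the eigenvalue spectrum
# of the density matrix, and hence the von Neumann entropy.
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
n = 6

# Random Hermitian "Hamiltonian" (illustrative)
A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
H = (A + A.conj().T) / 2

# Random mixed initial state: rho0 = B B† / tr(B B†)
B = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
rho0 = B @ B.conj().T
rho0 /= np.trace(rho0).real

def von_neumann_entropy(rho):
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]  # drop numerical zeros
    return float(-(p * np.log(p)).sum())

for t in [0.0, 0.5, 1.0, 5.0]:
    U = expm(-1j * H * t)            # unitary propagator (hbar = 1)
    rho_t = U @ rho0 @ U.conj().T
    print(t, von_neumann_entropy(rho_t))  # constant to machine precision
```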
 
  • #37
Einstein Mcfly said:
If the system is isolated in the sense that all interactions are described by a single basis (that is, nothing is "traced out"; another way to think of this is that everything is evolving under the same Hamiltonian), then there is no "randomizing" and all evolution is unitary (reversible). What you then have is COHERENT DEPHASING of the wave packet made up of vibrational states, not the flow of energy from vibrational modes into the environment that is usually meant when IVR is discussed. The intuition that the likelihood of ALL of the energy making its way back to the original mode is "vanishingly small" feels correct because the time it takes for the wave packet to reach a total revival (reacquire exactly its initial state) grows with the number of modes making up the wave packet. For rotational wave packets, for example, I believe it goes as the square of the number (I'm not sure about vibrations). Therefore, the larger the system is, the LONGER it will take for the revival to occur; but if it's actually isolated in the sense discussed above, it WILL happen and hence there is no increase in entropy.

As someone else mentioned, another way to think about this is that under unitary evolution the eigenvalue spectrum of the density matrix is preserved at all times. Therefore, in some basis it will always be diagonal with the same "populations"; all that happens is that the basis in which it is diagonal rotates in time.

Thanks for the detailed response ... I need to take some time to consider what you have said, and check it against the experimental results from the literature that I was thinking of when I constructed this example. In the meantime, there is just one picky point of order ... in chemical physics, which is the context my example was cast in, IVR does not involve coupling to an environment ... it is not vibrational relaxation. IVR means intramolecular vibrational redistribution ... i.e. the energy stays in the molecule but is redistributed over time throughout the vibrational modes of the molecule.

IVR is not typically described as a coherent process, but that may just be because the recurrence times for molecules of any meaningful size would be orders of magnitude longer than the spontaneous emission lifetimes for IR photons. I know Brooks Pate at Virginia has looked in detail at these kinds of IVR processes, but I haven't looked at those papers for many years. I will review them and see if they agree with your analysis. It does make sense to me that, if the IVR process is just a coherent dephasing, then provided the system does not interact with anything else on the time-scale of the experiment, the von Neumann entropy of the system should remain constant. I think your reply also makes it clear that my concerns about symmetry degeneracies of the normal modes were something of a red herring. If the system stays coherent, then by definition it is described by unitary time evolution, so those details will just "work themselves out".
 
  • #38
A. Neumaier said:
And ignoring this (correlations) is the reason for the increase of entropy.

Yes, because the total entropy is constant (due to time reversal symmetry) and the correlation entropy (mutual entropy) is decreasing.
 
  • #39
SpectraCat said:
Thanks for the detailed response ... I need to take some time to consider what you have said, and check it against the experimental results from the literature that I was thinking of when I constructed this example. In the meantime, there is just one picky point of order ... in chemical physics, which is the context my example was cast in, IVR does not involve coupling to an environment ... it is not vibrational relaxation. IVR means intramolecular vibrational redistribution ... i.e. the energy stays in the molecule but is redistributed over time throughout the vibrational modes of the molecule.

IVR is not typically described as a coherent process, but that may just be because the recurrence times for molecules of any meaningful size would be orders of magnitude longer than the spontaneous emission lifetimes for IR photons. I know Brooks Pate at Virginia has looked in detail at these kinds of IVR processes, but I haven't looked at those papers for many years. I will review them and see if they agree with your analysis. It does make sense to me that, if the IVR process is just a coherent dephasing, then provided the system does not interact with anything else on the time-scale of the experiment, the von Neumann entropy of the system should remain constant. I think your reply also makes it clear that my concerns about symmetry degeneracies of the normal modes were something of a red herring. If the system stays coherent, then by definition it is described by unitary time evolution, so those details will just "work themselves out".

Of course you're right: IVR doesn't necessarily involve coupling to the environment. However, as I described above, if it's not coupled to the environment in any way, then it MUST be described as a coherent process (at least if you're interested in the dynamics at long times). The coherence in these dynamics is the key to the coherent control processes being carried out all over the place using electronic states (Rydberg wave packets) and vibrational and rotational wave packets (molecular alignment), including by Brooks Pate. From his website:

"A major emphasis of our work is understanding the spectroscopy of molecules as the IVR process, and possibly reaction, occurs. In particular, we are interested in how coherent excitation of highly excited molecules can be used to influence reaction products."

It's neat stuff, as is the inclusion of dissipative processes for larger systems (like proteins and such). Dwayne Miller's group up in Toronto has done some fascinating experiments that show coherent processes in the retinal chromophore of rhodopsin, as have Graham Fleming's group in Berkeley and Greg Engel's group at the University of Chicago, to name a few.
 
  • #40
A. Neumaier said:
To understand things in an extremely simplified but appropriate setting, consider the Lorenz attractor - it is a time-reversal invariant ordinary differential equation in 3 variables, and nevertheless shows an increase of entropy for every initial ensemble close to the attractor, no matter whether you run the dynamics forward or backward. You can easily program the Lorenz attractor yourself in systems like Matlab or Mathematica, and convince yourself of the fact that once approximations are made, irreversibility "follows" easily from time-reversibility plus chaoticity.

Actually, the Lorenz system is dissipative, and running it backwards gives nonsense. Thus one needs to take a chaotic Hamiltonian system, such as the double pendulum (for values of the energy where this is chaotic).
http://en.wikipedia.org/wiki/Double_pendulum
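A minimal numerical sketch of the dissipation point (Python, with the standard Lorenz parameters; the setup is an assumption for illustration, not from the thread): the Lorenz flow has constant negative divergence −(σ + 1 + β), so a small cloud of initial conditions loses phase-space volume as it evolves, and the time-reversed flow correspondingly expands volume and misbehaves.

```python
# The Lorenz flow is dissipative: div(f) = -(sigma + 1 + beta) < 0 everywhere,
# so any small cloud of initial conditions contracts in phase-space volume.
import numpy as np
from scipy.integrate import solve_ivp

sigma, r, beta = 10.0, 28.0, 8.0 / 3.0  # standard Lorenz parameters

def lorenz(t, v):
    x, y, z = v
    return [sigma * (y - x), x * (r - z) - y, x * y - beta * z]

# Small cloud of points near the attractor
rng = np.random.default_rng(1)
cloud = np.array([1.0, 1.0, 20.0]) + 1e-3 * rng.normal(size=(50, 3))

def cloud_volume(points):
    # Volume proxy: sqrt of the determinant of the cloud's covariance matrix
    return float(np.sqrt(np.linalg.det(np.cov(points.T))))

for T in [0.0, 0.2, 0.5, 1.0]:
    evolved = cloud if T == 0.0 else np.array(
        [solve_ivp(lorenz, (0.0, T), p, rtol=1e-9, atol=1e-12).y[:, -1]
         for p in cloud])
    print(T, cloud_volume(evolved))  # contracts at mean rate -(sigma+1+beta)
```

A Hamiltonian system such as the double pendulum would instead preserve this volume exactly (Liouville's theorem), which is why it is the better toy model here.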
 
  • #42
So, my current understanding is this:

1) In an isolated system, the "total entropy" stays constant at all times. This is true in the classical case (joint entropy = thermodynamical entropy - correlational information) as well as in the quantum case (von Neumann entropy).

2) In deriving the H-Theorem, one has to neglect correlations. Since the joint entropy doesn't change, the increase in thermodynamical entropy is compensated by the increase of correlational information.

3) As far as Loschmidt's paradox is concerned, there's no fundamental difference between classical systems and quantum mechanical ones.

My next step is to read the papers provided by Demystifier.

[Also thanks to Einstein Mcfly for resolving the IVR problem. This system sounds interesting indeed. So I have been wrong in assuming that it is a relaxation process.]
 
  • #43
kith said:
So, my current understanding is this:

1) In an isolated system, the "total entropy" stays constant at all times. This is true in the classical case (joint entropy = thermodynamical entropy - correlational information) as well as in the quantum case (von Neumann entropy).

2) In deriving the H-Theorem, one has to neglect correlations. Since the joint entropy doesn't change, the increase in thermodynamical entropy is compensated by the increase of correlational information.

3) As far as Loschmidt's paradox is concerned, there's no fundamental difference between classical systems and quantum mechanical ones.

Yes, that's it!

Moreover, the dynamics for which the H-theorem is derived is a coarse-grained, approximate (though for macroscopic systems quite accurate) dynamics for the state of the system. An exact dynamics would have to keep an infinite memory, and the H-theorem would no longer be derivable.
 
  • #44
kith said:
So, my current understanding is this:
1) In an isolated system, the "total entropy" stays constant at all times. This is true in the classical case (joint entropy = thermodynamical entropy - correlational information) as well as in the quantum case (von Neumann entropy).

Yes, but realize that you should say "marginal entropy", not "thermodynamical entropy". Joint entropy, marginal entropy, correlation information (or mutual information) are all information-theoretic entropies and are dimensionless. Thermodynamic entropy is marginal entropy times Boltzmann's constant. Boltzmann's constant has dimensions of thermodynamic entropy (e.g. Joule/kelvin).

Correct me if I am wrong, but I think the reason the marginal entropy is the only thermodynamically relevant entropy is because the internal energy of a substance is the thing that we can measure, and only the marginal entropy contributes to the energy. The correlations between the particles do not contribute to the internal energy or any other macroscopic thermodynamic variable. Does this sound correct?
 
  • #45
SpectraCat said:
Thank you for that explanation ... that is the way I understood the math to work in the case where the density matrix is diagonal. The issue with the example that I gave is that the off-diagonal elements of the matrix are time dependent ... i.e. the magnitudes of the anharmonic cross-couplings between the vibrational modes are dependent on the populations of those modes (by population I mean the number of vibrational quanta in a given mode). I believe the ramifications of this are that you cannot choose a unique basis that diagonalizes the density matrix at all times, but like I said, I am not that well-versed in the details of density matrices, so I may not have that completely correct.

The density matrix is a positive Hermitian operator at all times, thus there is always a basis in which it is diagonal. As you said, if you fix a given basis, the sizes of the off-diagonal elements may vary, but there is always a basis in which the density matrix is diagonal. If the evolution is unitary as I described above, then all that happens is that the diagonalizing basis rotates through Hilbert space. The eigenvalues remain fixed for all time.

There is a more general kind of evolution, called evolution via a completely positive map, which not only alters the diagonalizing basis but can also change the eigenvalues. This type of evolution can always be realized as unitary evolution on a larger system containing the system you're interested in. One regards the extended system as system+environment, so that the "fine-grained" entropy of the system of interest can change because of the coupling to the environment.
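A minimal sketch of such a map (Python; the amplitude-damping channel is a standard textbook example chosen for illustration, not the molecular system discussed):

```python
# A completely positive map in Kraus form, rho -> sum_k K_k rho K_k†,
# changes the eigenvalues of the density matrix - something no unitary can do.
import numpy as np

gamma = 0.3  # damping strength (arbitrary illustrative value)
K0 = np.array([[1.0, 0.0], [0.0, np.sqrt(1.0 - gamma)]])
K1 = np.array([[0.0, np.sqrt(gamma)], [0.0, 0.0]])
# Kraus completeness: K0† K0 + K1† K1 = identity

rho = np.array([[0.5, 0.5], [0.5, 0.5]])  # pure qubit state |+><+|
rho_out = K0 @ rho @ K0.T + K1 @ rho @ K1.T  # Kraus matrices here are real

print(np.linalg.eigvalsh(rho))      # [0, 1] -> pure state
print(np.linalg.eigvalsh(rho_out))  # both eigenvalues in (0, 1) -> mixed
```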

SpectraCat said:
However, I have an even more fundamental problem because I do not understand from a physical point of view how the entropy of the isolated vibrational system in the example I gave can possibly be time-invariant. It seems intuitively clear to me from the phenomenology of intramolecular vibrational redistribution that the process is entropically driven. In other words, the energy starts out localized as a single quantum of excitation in a single vibrational mode, and then becomes "randomized" as one or more quanta of excitation in multiple vibrational modes with lower energy. The total internal energy of the system remains constant, but the probability of the energy finding its way back into the mode that was initially excited is (I think) vanishingly small. That seems consistent with the evolution of the state from low entropy (all the energy in a single mode) to higher entropy (the energy redistributed among many modes).

A possible counter-argument to the description I gave above might be that, even though the energy is "randomized", at any instant in time it is described by a unique "pure state" of the system, which would have the same von Neumann entropy (zero) as the initial state with a single quantum of excitation in a single mode. This argument is probably valid for small molecules where the density of states is low; however, molecules belonging to point groups with degenerate irreducible representations have formal degeneracies that would give rise to non-zero entropies for particular combinations of vibrational quanta.

I would appreciate any insights you have on this ...

If the molecular system in question is well approximated by an isolated finite state system, then there will always be recurrences where after a finite time the state comes very close to the initial state. As one makes the system larger and larger these recurrences take longer and longer to occur.

Your argument about the pure state is exactly right. The entire system is in a pure state and evolves unitarily and thus the entropy of the full system is zero at all times. However, entropies of subsystems can certainly change in time. With a sufficiently complex molecule the dynamics of an effective subsystem could appear to relax to a finite entropy state for long periods. For example, suppose one begins with a pure state consisting of an electronic excitation unentangled with the vibrational degrees of freedom. We have S_{e} = 0 (the entropy of the electronic subsystem). As the full system evolves, the electronic degrees of freedom will become entangled with the vibrational degrees of freedom. This can easily result in the appearance of a non-zero S_e for long periods of time.
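A minimal illustration of this (Python; a two-qubit stand-in for the electronic and vibrational degrees of freedom, with an assumed toy Hamiltonian):

```python
# A global pure state evolving unitarily has zero total entropy at all times,
# yet the reduced state of one subsystem develops nonzero entanglement
# entropy S_e as the two subsystems become entangled.
import numpy as np
from scipy.linalg import expm

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)

# Toy Hamiltonian: two two-level "modes" with an exchange-like coupling
H = np.kron(sz, I2) + np.kron(I2, sz) + 0.5 * np.kron(sx, sx)

# Initial unentangled pure state: excitation in the first mode, |1>|0>
psi0 = np.kron([0, 1], [1, 0]).astype(complex)

def subsystem_entropy(psi):
    # Reduced density matrix of mode 1 (partial trace over mode 2),
    # then S_e = -tr(rho_e ln rho_e)
    M = psi.reshape(2, 2)
    p = np.linalg.eigvalsh(M @ M.conj().T)
    p = p[p > 1e-12]
    return float(-(p * np.log(p)).sum())

for t in [0.0, 0.5, 1.0, 1.5]:
    psi_t = expm(-1j * H * t) @ psi0
    print(t, subsystem_entropy(psi_t))  # 0 at t=0, then grows toward ln 2
```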

Hope this helps.
 
  • #46
Rap said:
Yes, but realize that you should say "marginal entropy", not "thermodynamical entropy". Joint entropy, marginal entropy, correlation information (or mutual information) are all information-theoretic entropies and are dimensionless. Thermodynamic entropy is marginal entropy times Boltzmann's constant. Boltzmann's constant has dimensions of thermodynamic entropy (e.g. Joule/kelvin).
In an information-theoretic context, one can always set Boltzmann's constant to 1 or ln 2, by choosing the Kelvin appropriately. Thus it is appropriate to equate the two entropies. (This is like choosing units such that c=1 in relativity or hbar=1 in quantum mechanics.)
Rap said:
Correct me if I am wrong, but I think the reason the marginal entropy is the only thermodynamically relevant entropy is because the internal energy of a substance is the thing that we can measure, and only the marginal entropy contributes to the energy. The correlations between the particles do not contribute to the internal energy or any other macroscopic thermodynamic variable. Does this sound correct?
One can measure various things, but they are all related to the thermodynamical entropy by the standard formalism.
 
  • #47
A. Neumaier said:
In an information-theoretic context, one can always set Boltzmann's constant to 1 or ln 2, by choosing the Kelvin appropriately. Thus it is appropriate to equate the two entropies. (This is like choosing units such that c=1 in relativity or hbar=1 in quantum mechanics.)

Yes. I just wanted to point out that the usual thermodynamic entropy is not identical to the marginal entropy. The marginal entropy is dimensionless; the usual thermodynamic entropy has units of e.g. Joule/Kelvin. I actually prefer to think of temperature as having units of energy: it multiplies the change in marginal (information) entropy to yield the change in internal energy at constant volume and constant particle number.
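In symbols (a standard relation, spelled out here for clarity; σ denotes the dimensionless marginal entropy):

```latex
% With sigma the dimensionless (information) entropy,
% the conventional thermodynamic entropy is
S = k_B\,\sigma ,
% and at constant volume V and particle number N
\mathrm{d}U\big|_{V,N} = T\,\mathrm{d}S = (k_B T)\,\mathrm{d}\sigma ,
% so setting k_B = 1 amounts to measuring temperature in units of energy.
```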
 
