Entropy in isolated quantum systems

kith
I'm still puzzled by Loschmidt's paradox.

In quantum mechanics, an isolated system has a unitary time evolution, which implies that the von Neumann entropy remains constant at all times. (Contrast this with the entropy increase due to mixing in isolated classical systems, for example.)

So whenever the entropy of a system increases, there must be other systems or an environment present to compensate for this increase. For example, the entropy of the universe as a whole should not be allowed to increase.

Any thoughts?
 
kith said:
I'm still puzzled by Loschmidt's paradox.

In quantum mechanics, an isolated system has a unitary time evolution, which implies that the von Neumann entropy remains constant at all times. (Contrast this with the entropy increase due to mixing in isolated classical systems, for example.)

So whenever the entropy of a system increases, there must be other systems or an environment present to compensate for this increase. For example, the entropy of the universe as a whole should not be allowed to increase.

Any thoughts?

Hmmm ... I am not sure why you say unitary time evolution implies that the von Neumann entropy is constant for an isolated quantum system at all times. Consider the case of a molecule whose highest-energy normal mode has been excited by a photon. At t=0, the energy from the photon is localized as one quantum of excitation in a single vibrational mode (a pure state, with zero von Neumann entropy). Over time, intramolecular vibrational redistribution (IVR), caused by the anharmonic couplings of other normal modes to the initially excited one, will cause that excitation energy to become "randomly" distributed over all the modes in the molecule (a mixed state, with nonzero von Neumann entropy). I use "randomly" only colloquially here; I doubt that it is really a stochastic process, particularly since we are already in the context of microscopic reversibility.

Have I somehow misunderstood something in the above example?
 
SpectraCat said:
Hmmm ... I am not sure why you say unitary time evolution implies that the von Neumann entropy is constant for an isolated quantum system at all times.
Isolated implies unitary time evolution. The von Neumann entropy depends only on the eigenvalues of the density matrix, and these are not changed by any unitary transformation.

Your example is a relaxation process; such processes usually can't be described by unitary time evolution. As an environment, you have at least the electromagnetic field present.

[Typical environments are much larger than the system of interest, so the impact of a change in the system on the environment can be neglected. This leads to an effective time evolution of the system density matrix which is not unitary (see the Lindblad equation, http://en.wikipedia.org/wiki/Lindblad_equation, for example).]
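Here is a minimal numerical sketch of this point (just numpy, with a random state and a random unitary standing in for any concrete system): the eigenvalues of the density matrix, and hence the von Neumann entropy, are untouched by the unitary.

Code:
import numpy as np

def von_neumann_entropy(rho):
    # S(rho) = -Tr[rho ln rho], computed from the eigenvalues
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]  # drop numerical zeros
    return -np.sum(p * np.log(p))

rng = np.random.default_rng(0)

# a random 4x4 density matrix: positive semidefinite, trace one
A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
rho = A @ A.conj().T
rho /= np.trace(rho).real

# a random unitary from a QR decomposition
Q, _ = np.linalg.qr(rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4)))
rho_evolved = Q @ rho @ Q.conj().T

# the two entropies agree to machine precision
print(von_neumann_entropy(rho), von_neumann_entropy(rho_evolved))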
 
In discussions about the classical, thermodynamic arrow of time, many people didn't consider Loschmidt's paradox to be a genuine paradox. So what about the quantum case? Maybe I should have chosen a more spectacular title like "the quantum arrow of time" to get more comments. ;)
 
kith said:
Isolated implies unitary time evolution. The von Neumann entropy depends only on the eigenvalues of the density matrix, and these are not changed by any unitary transformation.

Your example is a relaxation process; such processes usually can't be described by unitary time evolution. As an environment, you have at least the electromagnetic field present.

[Typical environments are much larger than the system of interest, so the impact of a change in the system on the environment can be neglected. This leads to an effective time evolution of the system density matrix which is not unitary (see the Lindblad equation, http://en.wikipedia.org/wiki/Lindblad_equation, for example).]

Ok ... I was not trying to create an example of a relaxation process. I was only trying to fix the total internal energy of the isolated quantum system and have all of the energy start out in a single mode. It seems to me that this particular starting state (all of the energy localized in a single mode) ought to have a lower entropy than a state where the energy is arbitrarily distributed over several modes of the molecule, since the latter case will likely involve several degeneracies (assuming the density of states is sufficiently high). The IVR process by which the vibrational energy becomes redistributed over the various modes is well understood and should be describable by unitary time evolution. In terms of the density matrix, it will simply evolve with time as the populations of the different states change. Based on the definition of the von Neumann entropy:

S = -k_B \, \mathrm{Tr}[\rho \ln \rho]

it seems like it would only stay constant if the normal modes of the molecule were strictly orthogonal so that the density matrix remained diagonal at all times. However, since the IVR process proceeds through cross-anharmonicities that couple the nominally orthogonal normal modes, this will not be the case.

Note that I am not that familiar with the ins and outs of density matrix formulations, or of von Neumann entropy, so it is entirely possible that I have made a mistake in the above analysis. If I have, I would be happy to learn more.
 
kith said:
In quantum mechanics, an isolated system has a unitary time evolution, which implies that the von Neumann entropy remains constant at all times.

I don't understand von Neumann entropy very well, but I do know that if you have a particle described by a Gaussian wavefunction (the momentum and position wave functions are Fourier transforms of each other), the sum of the information entropies of position and momentum increases in time.
 
kith said:
I'm still puzzled by Loschmidt's paradox.

In quantum mechanics, an isolated system has a unitary time evolution, which implies that the von Neumann entropy remains constant at all times. (Contrast this with the entropy increase due to mixing in isolated classical systems, for example.)

So whenever the entropy of a system increases, there must be other systems or an environment present to compensate for this increase. For example, the entropy of the universe as a whole should not be allowed to increase.

Any thoughts?
First, Loschmidt's paradox is a paradox of classical statistical mechanics, not quantum mechanics.

Second, you are right that unitary evolution of the whole system conserves entropy.

Third, when a quantum subsystem is entangled with another subsystem, the entropy of each subsystem may increase with time. If you see a paradox in this, note that the entropy of the whole system is not the sum of the entanglement entropies of its subsystems.
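A two-qubit toy calculation makes the third point concrete (a quick numpy sketch, nothing model-specific): for a Bell pair the total state is pure, so its entropy is zero, yet each one-qubit reduced state has entropy ln 2.

Code:
import numpy as np

def entropy(rho):
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]
    return -np.sum(p * np.log(p))

# Bell state (|00> + |11>)/sqrt(2): a pure state of two qubits
psi = np.zeros(4)
psi[0] = psi[3] = 1 / np.sqrt(2)
rho = np.outer(psi, psi)              # full density matrix

# reduced density matrix of qubit A: trace out qubit B
rho_A = np.einsum('abcb->ac', rho.reshape(2, 2, 2, 2))

print(entropy(rho))    # 0: the whole is pure
print(entropy(rho_A))  # ln 2: the part is maximally mixed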
 
kith said:
I'm still puzzled by Loschmidt's paradox.



Check out http://en.wikipedia.org/wiki/Mutual_information#Applications_of_mutual_information

Basically, from an information-theoretic point of view, there are two entropies in the classical case: the marginal entropy, which you get by assuming the particle energies are uncorrelated, and the mutual entropy, which comes from the correlations of the energies among the particles. Liouville's theorem says that their sum remains constant. If you start out with uncorrelated energies, all the entropy is in the marginal entropy. As time goes by, collisions occur and the energies become correlated; the marginal entropy increases and the mutual entropy decreases. The thermodynamic entropy is the marginal entropy: specifying a thermodynamic state gives you no information about correlations. So the thermodynamic entropy increases, while the total information entropy remains constant. When Boltzmann derived his H-theorem, this was essentially what he was doing: assuming that the particle energies were forever uncorrelated, which gave an increase in entropy.
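Here is a discrete toy version of this bookkeeping (a numpy sketch; a measure-preserving shear of the joint distribution stands in for the Liouville flow, and I use mutual information = sum of marginals minus joint, so signs may differ from other conventions): the joint entropy never changes, while the marginal entropies and the correlations grow together.

Code:
import numpy as np

def H(p):
    # Shannon entropy of a probability array
    p = p.ravel()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

N = 16
# start uncorrelated: joint distribution = product of the two marginals
px = np.ones(N) / N
py = np.exp(-0.5 * ((np.arange(N) - N / 2) / 2.0) ** 2)
py /= py.sum()
p = np.outer(px, py)

for step in range(5):
    joint = H(p)
    marginals = H(p.sum(axis=1)) + H(p.sum(axis=0))
    print(step, joint, marginals, marginals - joint)  # joint stays fixed
    # "dynamics": the bijective shear (x, y) -> (x, x + y mod N),
    # which permutes the cells and therefore conserves the joint entropy
    p = np.array([np.roll(p[x], x) for x in range(N)])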
 
Maybe I should emphasize my main questions. I'm not sure whether the answers lie entirely in quantum mechanics; maybe my knowledge of classical statistical mechanics needs some refreshing.

1) Is it possible that a time asymmetry arises in an isolated quantum system? If not, this should mean that there is no universal arrow of time, and also no heat death of the universe. Shouldn't it?
2) The arrow of time we perceive in our systems of interest would ultimately be due to interactions with the environment, which lead to non-unitary time evolution. Why does this arrow always point in the same direction? That is, why do typical environments always increase the entropy of the system?

[Thanks for all the replies so far; I need some time to answer them.]
 
  • #10
Rap said:
I don't understand von Neumann entropy very well, but I do know that if you have a particle described by a Gaussian wavefunction (the momentum and position wave functions are Fourier transforms of each other), the sum of the information entropies of position and momentum increases in time.
Just to understand you correctly: by the "information entropy of position" you mean the Shannon entropy of the probability distribution given by
|\Psi(\vec{r})|^2
(according to Wikipedia, there are technical difficulties in extending the concept of Shannon entropy to the continuous case, but I guess that's not important for us now)

I have to think about this. Right now, I don't see where the time asymmetry in this broadening of wave packets comes from, since the time evolution is unitary.

However, this only remotely touches my initial question. The von Neumann entropy S is a measure of the purity of a state. An isolated wave packet remains a pure state at all times, so S=0 at all times.
 
  • #11
Demystifier said:
First, Loschmidt's paradox is a paradox of classical statistical mechanics, not quantum mechanics.
Why not? There's the quantum mechanical H-theorem, which leads to the second law of thermodynamics.

Demystifier said:
Third, when a quantum subsystem is entangled with another subsystem, the entropy of each subsystem may increase with time. If you see a paradox in this, note that the entropy of the whole system is not the sum of the entanglement entropies of its subsystems.
I had completely forgotten to consider that. Thanks for pointing it out!
 
  • #13
SpectraCat said:
Ok ... I was not trying to create an example of a relaxation process. I was only trying to fix the total internal energy of the isolated quantum system and have all of the energy start out in a single mode. It seems to me that this particular starting state (all of the energy localized in a single mode) ought to have a lower entropy than a state where the energy is arbitrarily distributed over several modes of the molecule, since the latter case will likely involve several degeneracies (assuming the density of states is sufficiently high). The IVR process by which the vibrational energy becomes redistributed over the various modes is well understood and should be describable by unitary time evolution. In terms of the density matrix, it will simply evolve with time as the populations of the different states change. Based on the definition of the von Neumann entropy:

S = -k_B \, \mathrm{Tr}[\rho \ln \rho]

it seems like it would only stay constant if the normal modes of the molecule were strictly orthogonal so that the density matrix remained diagonal at all times. However, since the IVR process proceeds through cross-anharmonicities that couple the nominally orthogonal normal modes, this will not be the case.

Note that I am not that familiar with the ins and outs of density matrix formulations, or of von Neumann entropy, so it is entirely possible that I have made a mistake in the above analysis. If I have, I would be happy to learn more.

The full density matrix of an isolated system evolves as \rho \rightarrow U \rho U^+. It follows that S = -\mathrm{Tr}(\rho \log{\rho}) is invariant. Indeed, the eigenvalues p_i of \rho and U \rho U^+ are identical, and S = -\sum_i p_i \log{p_i} (just compute the trace by going to the diagonal basis).

Of course, as others here have mentioned, the entropy of a subsystem certainly can increase even if the entire system is evolving unitarily.
 
  • #14
Physics Monkey said:
The full density matrix of an isolated system evolves as \rho \rightarrow U \rho U^+. It follows that S = -\mathrm{Tr}(\rho \log{\rho}) is invariant. Indeed, the eigenvalues p_i of \rho and U \rho U^+ are identical, and S = -\sum_i p_i \log{p_i} (just compute the trace by going to the diagonal basis).

Of course, as others here have mentioned, the entropy of a subsystem certainly can increase even if the entire system is evolving unitarily.

Thank you for that explanation ... that is the way I understood the math to work in the case where the density matrix is diagonal. The issue with the example that I gave is that the off-diagonal elements of the matrix are time dependent, i.e. the magnitudes of the anharmonic cross-couplings between the vibrational modes depend on the populations of those modes (by population I mean the number of vibrational quanta in a given mode). I believe the ramification of this is that you cannot choose a unique basis that diagonalizes the density matrix at all times, but like I said, I am not that well-versed in the details of density matrices, so I may not have that completely correct.

However, I have an even more fundamental problem because I do not understand from a physical point of view how the entropy of the isolated vibrational system in the example I gave can possibly be time-invariant. It seems intuitively clear to me from the phenomenology of intramolecular vibrational redistribution that the process is entropically driven. In other words, the energy starts out localized as a single quantum of excitation in a single vibrational mode, and then becomes "randomized" as one or more quanta of excitation in multiple vibrational modes with lower energy. The total internal energy of the system remains constant, but the probability of the energy finding its way back into the mode that was initially excited is (I think) vanishingly small. That seems consistent with the evolution of the state from low entropy (all the energy in a single mode) to higher entropy (the energy redistributed among many modes).

A possible counter-argument to the description I gave above might be that, even though the energy is "randomized", at any instant in time it is described by a unique "pure state" of the system, which would have the same von Neumann entropy (zero) as the initial state with a single quantum of excitation in a single mode. This argument is probably valid for small molecules where the density of states is low; however, molecules belonging to symmetric point groups with degenerate irreducible representations have formal degeneracies that would give rise to non-zero entropies for particular combinations of vibrational quanta.

I would appreciate any insights you have on this ...
 
  • #15
kith said:
Just to understand you correctly: by the "information entropy of position" you mean the Shannon entropy of the probability distribution given by
|\Psi(\vec{r})|^2
(according to Wikipedia, there are technical difficulties in extending the concept of Shannon entropy to the continuous case, but I guess that's not important for us now)

Yes. If \Psi_x(x,t) is the position wavefunction (one-dimensional) and \Psi_p(p,t) is the momentum wavefunction, and the associated probabilities |\Psi_x(x,t)|^2 and |\Psi_p(p,t)|^2 are Gaussian, such that the Heisenberg uncertainty relation is exactly saturated at time t=0, then the entropies are:
H_x = -\int_{-\infty}^\infty |\Psi_x(x,t)|^2 \ln(|\Psi_x(x,t)|^2)dx
H_p = -\int_{-\infty}^\infty |\Psi_p(p,t)|^2 \ln(|\Psi_p(p,t)|^2)dp
and (setting \hbar=1):
H_x+H_p = \ln(e\pi\sqrt{1+\tau^2})
where
\tau=\frac{t}{2m\sigma^2}
and \sigma^2 is the variance of the position Gaussian.
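This is easy to check numerically if anyone wants to (a short sketch; it uses only the closed-form entropy of a Gaussian, (1/2)ln(2 pi e Var), and the fact that for a free particle the momentum distribution does not change while the position variance grows as \sigma^2(1+\tau^2)):

Code:
import numpy as np

m, sigma2 = 1.0, 0.7   # mass and initial position variance sigma^2 (hbar = 1)

def gaussian_entropy(var):
    # differential entropy of a Gaussian with the given variance
    return 0.5 * np.log(2 * np.pi * np.e * var)

for t in (0.0, 1.0, 5.0):
    tau = t / (2 * m * sigma2)
    Hx = gaussian_entropy(sigma2 * (1 + tau**2))  # position keeps spreading
    Hp = gaussian_entropy(1 / (4 * sigma2))       # momentum is static
    # both columns agree; the sum starts at ln(e*pi) and then grows
    print(Hx + Hp, np.log(np.e * np.pi * np.sqrt(1 + tau**2)))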
 
  • #16
kith said:
Why not? There's the quantum mechanical H-theorem, which leads to the second law of thermodynamics.
OK, I admit that there is also a quantum analogue of the Loschmidt paradox.
 
  • #17
kith said:
1) Is it possible that a time asymmetry arises in an isolated quantum system?
Certainly yes. It simply means that there exists a solution \psi(x,t) of the Schrödinger equation such that |\psi(x,t)| is not equal to |\psi(x,-t)|. In fact, most solutions are like that.
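A free Gaussian packet with nonzero mean momentum already does the job; here is a small FFT sketch (hbar = m = 1, exact one-step free propagation in momentum space): |psi(x,t)| is peaked near +6 while |psi(x,-t)| is peaked near -6.

Code:
import numpy as np

x = np.linspace(-40, 40, 2048)
k = 2 * np.pi * np.fft.fftfreq(x.size, d=x[1] - x[0])
psi0 = np.exp(-x**2) * np.exp(1j * 3.0 * x)   # mean momentum k0 = 3

def psi(t):
    # exact free-particle propagator applied in momentum space
    return np.fft.ifft(np.exp(-1j * k**2 * t / 2) * np.fft.fft(psi0))

for t in (2.0, -2.0):
    print(t, x[np.argmax(np.abs(psi(t)))])   # packet center ~ k0 * t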

kith said:
2) The arrow of time we perceive in our systems of interest would ultimately be due to interactions with the environment, which lead to non-unitary time evolution.
I don't think so. I think it is due to the initial condition, which, for some reason, was "ordered".

kith said:
Why does this arrow always point in the same direction? That is, why do typical environments always increase the entropy of the system?
Perhaps the best answer is provided by
http://arxiv.org/abs/1011.4173
This is a classical explanation, but the idea can easily be extended to the quantum case as well. Indeed, the classical explanation above has been partially motivated by a work in quantum mechanics
http://arxiv.org/abs/0802.0438 [Phys.Rev.Lett.103:080401,2009]

Note in particular (in the first paper above) that there are many inequivalent definitions of entropy. Some of them do not depend on time, while others do. So one must be very careful about what one means by "entropy" when one claims that entropy does or does not increase with time.
 
  • #18
kith said:
I'm still puzzled by Loschmidt's paradox.

In quantum mechanics, an isolated system has a unitary time evolution, which implies that the von Neumann entropy remains constant at all times. (Contrast this with the entropy increase due to mixing in isolated classical systems, for example.)

So whenever the entropy of a system increases, there must be other systems or an environment present to compensate for this increase. For example, the entropy of the universe as a whole should not be allowed to increase.

Any thoughts?
Systems observable by us are never isolated, so the entropy increases.

The universe as a whole is isolated - it is the _only_ isolated system containing us! Therefore its entropy is constant. But this has no observable consequences since there is no way to measure the total entropy of an inhomogeneous system of this size.
 
  • #19
kith said:
2) The arrow of time we perceive in our systems of interest would ultimately be due to interactions with the environment, which lead to non-unitary time evolution. Why does this arrow always point in the same direction? That is, why do typical environments always increase the entropy of the system?

Because the interaction with the environment consists of a huge number of small contributions of nearly independent tiny subsystems of the environment, so that the law of large numbers applies (as evidenced by the techniques used in nonequilibrium statistical mechanics).
 
  • #20
SpectraCat said:
However, I have an even more fundamental problem because I do not understand from a physical point of view how the entropy of the isolated vibrational system in the example I gave can possibly be time-invariant.
I'm not very familiar with molecular dynamics, but it usually goes like this: if you consider an isolated system like your molecule and you start in an eigenstate of the corresponding Hamiltonian, the system stays in this state forever, because all eigenstates are stationary states. From a physical point of view, this is counterintuitive, because in real experiments you always have the electromagnetic field present, which gives you finite lifetimes and relaxation rates.
 
  • #21
A. Neumaier said:
The universe as a whole is isolated - it is the _only_ isolated system containing us! Therefore its entropy is constant. But this has no observable consequences since there is no way to measure the total entropy of an inhomogeneous system of this size.
Ok. So the picture of a heat death of the universe is wrong?

A. Neumaier said:
Because the interaction with the environment consists of a huge number of small contributions of nearly independent tiny subsystems of the environment, so that the law of large numbers applies (as evidenced by the techniques used in nonequilibrium statistical mechanics).
Can you elaborate on this? I still don't get how the time-symmetric interaction laws of these subsystems can lead to a time-asymmetric law like the H-theorem, and how the law of large numbers explains this.
 
  • #22
For the other answers, especially those of Rap and Demystifier, I need more time.
 
  • #23
kith said:
I'm not very familiar with molecular dynamics, but it usually goes like this: if you consider an isolated system like your molecule and you start in an eigenstate of the corresponding Hamiltonian, the system stays in this state forever, because all eigenstates are stationary states. From a physical point of view, this is counterintuitive, because in real experiments you always have the electromagnetic field present, which gives you finite lifetimes and relaxation rates.

This is because real experiments are observed, and hence interact with the environment. Thus the unitary evolution applies only approximately, to the extent you can neglect the interaction.
 
  • #24
A. Neumaier said:
This is because real experiments are observed, and hence interact with the environment. Thus the unitary evolution applies only approximately, to the extent you can neglect the interaction.
There is no need for observation. The interaction with the electromagnetic field as an environment is sufficient to destroy the unitarity of the time evolution.
 
  • #25
kith said:
Ok. So the picture of a heat death of the universe is wrong?
Yes. There is no hint at all that the universe should suffer a heat death. The story of the heat death arose from overinterpreting the second law, at a time when its connection to mechanics was not yet understood.

kith said:
Can you elaborate on this? I still don't get how the time-symmetric interaction laws of these subsystems can lead to a time-asymmetric law like the H-theorem, and how the law of large numbers explains this.
Any derivation of the second law makes some form of assumption about the (mixed) initial state (and often also about each later state). If this assumption is satisfied, then the evolution from this state into the future satisfies the second law. But so does the time-reversed evolution: going from that state into the past also increases the entropy. This shows that there is something artificial about the assumption.
The truth is that the assumption is only approximately satisfied at any time, and since the dynamics is chaotic, the uncertainty can have arbitrarily large consequences, though with arbitrarily small probability.

In Boltzmann's analysis the environment remains hidden, but acts by restoring the independence assumption at _all_ times. In the more refined version that works for more complex systems than ideal gases, one typically assumes that the state at some initial time is Gaussian (or, more technically, quasi-free), which is an assumption of just this kind. In this case, the environment consists of all the stuff that is not characterized by the comparatively few variables chosen to describe the macroscopic system, to which the Gaussian assumption is applied.

To understand things in an extremely simplified but appropriate setting, consider the Lorenz attractor: it is a time-reversal invariant ordinary differential equation in 3 variables, and it nevertheless shows an increase of entropy for every initial ensemble close to the attractor, no matter whether you run the dynamics forward or backward. You can easily program the Lorenz attractor yourself in systems like Matlab or Mathematica and convince yourself that, once approximations are made, irreversibility "follows" easily from time-reversibility plus chaoticity.
 
  • #26
kith said:
There is no need for observation. The interaction with the electromagnetic field as an environment is sufficient to destroy the unitarity of the time evolution.

This _is_ already observation. The system is observed once photons can leave the system, no matter whether some recording device such as a photographic plate or a human eye is there to receive the photons.
 
  • #27
A. Neumaier said:
This _is_ already observation. The system is observed once photons can leave the system, no matter whether some recording device such as a photographic plate or a human eye is there to receive the photons.

That statement seems to contradict the idea of entanglement, Bell's theorem, and the experiments by groups such as those of Aspect and Zeilinger. Those experiments show that, until one member of a photon pair has been destructively detected, the pair remains entangled. Furthermore, entanglement seems not to be an exclusive feature of the original pair, but rather a property that can be transferred to other pairs of particles.

So from that point of view, there seems to be a definite distinction between the emission of a photon, and the detection of that photon.
 
  • #28
SpectraCat said:
That statement seems to contradict the idea of entanglement, Bell's theorem, and the experiments by groups such as those of Aspect and Zeilinger. Those experiments show that, until one member of a photon pair has been destructively detected, the pair remains entangled. Furthermore, entanglement seems not to be an exclusive feature of the original pair, but rather a property that can be transferred to other pairs of particles.

So from that point of view, there seems to be a definite distinction between the emission of a photon, and the detection of that photon.

Of course there is such a distinction. But this doesn't contradict my statement.

Arguments about the environment become meaningless if one changes the meaning of the terms "system" and "environment" during the argument. One therefore needs to specify beforehand what counts as the system and what counts as the environment, and then stick to that.

In Zeilinger's experiment, the system is the pair of entangled photons. The experimental arrangement ensures that nothing leaves this system until the measurement. (This is not easy. Without special precautions, the system decoheres long before it is measured. That's why entanglement experiments over large distances are difficult to perform.)

The context of my previous statement included the assumption that the system is coupled to the electromagnetic environment (which therefore does not belong to the system). Thus photons leave _that_ system, which decoheres it.
 
  • #29
A. Neumaier said:
Systems observable by us are never isolated, so the entropy increases.

The universe as a whole is isolated - it is the _only_ isolated system containing us! Therefore its entropy is constant. But this has no observable consequences since there is no way to measure the total entropy of an inhomogeneous system of this size.

I would not dare to assign an entropy to the whole universe, nor a wave function, etc.
I would even hesitate to call it isolated. Isolated means isolated from the effects of the surroundings, but the universe has no surroundings.
 
  • #30
DrDu said:
I would not dare to assign an entropy to the whole universe, nor a wave function, etc.
I know that I am more daring, but on good grounds.

It is an undecidable question whether the state of the universe is pure or mixed. So, yes, one cannot necessarily assign a wave function to the universe, since this is possible only if the universe is in a pure state.

However, the universe must have a state. For if not, there would have to be a limit on the size of a system to be described by a state. This size would be completely arbitrary.

Statistical mechanics describes ordinary macroscopic matter very successfully by a mixed state, and the resulting hydrodynamic description seems to be an excellent model for the visible part of the universe. It would be very strange if such large systems were described by quantum mechanics (and hence had a state) but no such description applied to even bigger systems, just because we cannot observe them. Cosmology would not make sense without allowing the universe to have a state.


DrDu said:
I would even hesitate to call it isolated. Isolated means isolated from the effects of the surroundings, but the universe has no surroundings.

Everything in the universe is coupled to its complement in the universe, hence not isolated, except if this complement is empty. Thus if the universe were not isolated, then nothing would be, and the term would be vacuous.

On the other hand, if we define the universe of an observer O as the smallest isolated system containing O, it is conceivable that there are many universes; they just don't interact, so we cannot know anything about any universe except ours. If many universes existed, then their environment would not be empty but would consist of all the other universes (from which they are isolated).

In the spirit of Ockham's razor, we can however ignore all other universes and deny their existence, without _any_ loss of predictivity for the inhabitants of _our_ universe.
 
  • #31
A. Neumaier said:
However, the universe must have a state. For if not, there would have to be a limit on the size of a system to be described by a state. This size would be completely arbitrary.

I don't know. In the Copenhagen spirit I would argue that the remaining part of the universe (given that it is a compact manifold so that your limit argument makes sense) has to be large enough to define a classical system, i.e. an observer.
 
  • #32
DrDu said:
I don't know. In the Copenhagen spirit I would argue that the remaining part of the universe (given that it is a compact manifold so that your limit argument makes sense) has to be large enough to define a classical system, i.e. an observer.

In the Copenhagen spirit, a classical system is not an observer. The wave function is a device to encapsulate knowledge gained from previous measurements. A classical system has no "knowledge" unless it is a human or machine making QM calculations.
 
  • #33
A. Neumaier said:
In Boltzmann's analysis the environment remains hidden, but acts by restoring the independence assumption at _all_ times.

Yes, and restoring independence at all times amounts to ignoring the correlations that result from collisions. The total entropy is equal to the marginal entropy (the entropy resulting from assuming independence at all times) plus the mutual entropy (the entropy arising from the correlations). The total entropy is constant, and the marginal entropy, which is identified with the thermodynamic entropy, always increases.
 
  • #34
DrDu said:
I don't know. In the Copenhagen spirit I would argue that the remaining part of the universe (given that it is a compact manifold so that your limit argument makes sense) has to be large enough to define a classical system, i.e. an observer.

What defines an observer, in physical terms? This is an ill-defined notion.

The advantage of my thermal interpretation (see https://www.physicsforums.com/showthread.php?t=490492 ) is that one doesn't need such ill-defined terms to make sense of quantum mechanics. Classical systems are simply quantum mechanical systems in which all observables of interest are almost certain in a well-defined sense. And an observer is nowhere needed, since the whole physics is observer-independent.
 
  • #35
Rap said:
Yes, and restoring independence at all times amounts to ignoring the correlations that result from collisions.
And ignoring this is the reason for the increase of entropy.
 
  • #36
SpectraCat said:
Thank you for that explanation ... that is the way I understood the math to work in the case where the density matrix is diagonal. The issue with the example that I gave is that the off-diagonal elements of the matrix are time dependent, i.e. the magnitudes of the anharmonic cross-couplings between the vibrational modes depend on the populations of those modes (by population I mean the number of vibrational quanta in a given mode). I believe the ramification of this is that you cannot choose a unique basis that diagonalizes the density matrix at all times, but like I said, I am not that well-versed in the details of density matrices, so I may not have that completely correct.

However, I have an even more fundamental problem because I do not understand from a physical point of view how the entropy of the isolated vibrational system in the example I gave can possibly be time-invariant. It seems intuitively clear to me from the phenomenology of intramolecular vibrational redistribution that the process is entropically driven. In other words, the energy starts out localized as a single quantum of excitation in a single vibrational mode, and then becomes "randomized" as one or more quanta of excitation in multiple vibrational modes with lower energy. The total internal energy of the system remains constant, but the probability of the energy finding its way back into the mode that was initially excited is (I think) vanishingly small. That seems consistent with the evolution of the state from low entropy (all the energy in a single mode) to higher entropy (the energy redistributed among many modes).

A possible counter-argument to the description I gave above might be that, even though the energy is "randomized", at any instant in time it is described by a unique "pure state" of the system, which would have the same von Neumann entropy (zero) as the initial state with a single quantum of excitation in a single mode. This argument is probably valid for small molecules where the density of states is low; however, molecules belonging to symmetric point groups with degenerate irreducible representations have formal degeneracies that would give rise to non-zero entropies for particular combinations of vibrational quanta.

I would appreciate any insights you have on this ...

If the system is isolated in the sense that all interactions are described in a single basis (that is, nothing is "traced out"; another way to think of this is that everything is evolving under the same Hamiltonian), then there is no "randomizing" and all evolution is unitary (reversible). What you then have is COHERENT DEPHASING of the wave packet made up of vibrational states, not the flow of energy from vibrational modes into the environment that is usually meant when IVR is discussed. The idea that the likelihood of ALL of the energy making its way back to the original mode is "vanishingly small" might feel correct intuitively, because the time that it takes for the wave packet to reach a total revival (to reacquire exactly its initial state) grows with the number of modes making up the wave packet. For rotational wave packets, for example, I believe it goes as the square of that number (I'm not sure about vibrations). Therefore, the larger the system is, the LONGER it will take for the revival to occur; but if the system is actually isolated in the sense discussed above, the revival WILL happen, and hence there is no increase in entropy.

As someone else mentioned, another way to think about this is that under unitary evolution the eigenvalue spectrum of the density matrix is preserved at all times. Therefore, in some basis it will always be diagonal with the same "populations", and all that's happening is that the basis in which the system is diagonal rotates in time.
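A bare-bones numerical cartoon of the dephasing and revival (random integer energies instead of a real vibrational Hamiltonian, so the full revival falls exactly at t = 2 pi; only the survival probability is computed):

Code:
import numpy as np

rng = np.random.default_rng(1)
N = 40
E = rng.integers(1, 200, size=N).astype(float)  # commensurate (integer) energies
c2 = rng.random(N)
c2 /= c2.sum()                                  # populations |c_n|^2

def survival(t):
    # P(t) = |<psi(0)|psi(t)>|^2 for fixed populations over the eigenstates
    return abs(np.sum(c2 * np.exp(-1j * E * t)))**2

print(survival(0.0))          # 1: the initial state
print(survival(0.5))          # typically tiny: "dephased", looks irreversible
print(survival(2 * np.pi))    # 1 again: the full revival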
 
  • #37
Einstein Mcfly said:
If the system is isolated in the sense that all interactions are described in a single basis (that is, nothing is "traced out"; another way to think of this is that everything is evolving under the same Hamiltonian), then there is no "randomizing" and all evolution is unitary (reversible). What you then have is COHERENT DEPHASING of the wave packet made up of vibrational states, not the flow of energy from vibrational modes into the environment that is usually meant when IVR is discussed. The idea that the likelihood of ALL of the energy making its way back to the original mode is "vanishingly small" might feel correct intuitively, because the time that it takes for the wave packet to reach a total revival (to reacquire exactly its initial state) grows with the number of modes making up the wave packet. For rotational wave packets, for example, I believe it goes as the square of that number (I'm not sure about vibrations). Therefore, the larger the system is, the LONGER it will take for the revival to occur; but if the system is actually isolated in the sense discussed above, the revival WILL happen, and hence there is no increase in entropy.

As someone else mentioned, another way to think about this is that under unitary evolution the eigenvalue spectrum of the density matrix is preserved at all times. Therefore, in some basis it will always be diagonal with the same "populations", and all that's happening is that the basis in which the system is diagonal rotates in time.

Thanks for the detailed response ... I need to take some time to consider what you have said and check it against the experimental results from the literature that I was thinking of when I constructed this example. In the meantime, there is just one picky point of order: in chemical physics, which is the context my example was cast in, IVR does not involve coupling to an environment; it is not vibrational relaxation. IVR means intramolecular vibrational redistribution, i.e. the energy stays in the molecule but is redistributed over time throughout the vibrational modes of the molecule.

IVR is not typically described as a coherent process, but that may just be because the recurrence times for molecules of any meaningful size would be orders of magnitude longer than the spontaneous emission lifetimes for IR photons. I know Brooks Pate at Virginia has looked in detail at these kinds of IVR processes, but I haven't looked at those papers for many years. I will review them and see if they agree with your analysis. It does make sense to me that, if the IVR process is just coherent dephasing, then provided the system does not interact with anything else on the time scale of the experiment, the von Neumann entropy of the system should remain constant. I think your reply also makes it clear that my concerns about symmetry degeneracies of the normal modes were something of a red herring. If the system stays coherent, then by definition it is described by unitary time evolution, so those details will just "work themselves out".
 
  • #38
A. Neumaier said:
And ignoring this (correlations) is the reason for the increase of entropy.

Yes, because the total entropy is constant (by Liouville's theorem) and the correlation entropy (mutual entropy) is decreasing.
 
  • #39
SpectraCat said:
Thanks for the detailed response ... I need to take some time to consider what you have said and check it against the experimental results from the literature that I was thinking of when I constructed this example. In the meantime, there is just one picky point of order: in chemical physics, which is the context my example was cast in, IVR does not involve coupling to an environment; it is not vibrational relaxation. IVR means intramolecular vibrational redistribution, i.e. the energy stays in the molecule but is redistributed over time throughout the vibrational modes of the molecule.

IVR is not typically described as a coherent process, but that may just be because the recurrence times for molecules of any meaningful size would be orders of magnitude longer than the spontaneous emission lifetimes for IR photons. I know Brooks Pate at Virginia has looked in detail at these kinds of IVR processes, but I haven't looked at those papers for many years. I will review them and see if they agree with your analysis. It does make sense to me that, if the IVR process is just coherent dephasing, then provided the system does not interact with anything else on the time scale of the experiment, the von Neumann entropy of the system should remain constant. I think your reply also makes it clear that my concerns about symmetry degeneracies of the normal modes were something of a red herring. If the system stays coherent, then by definition it is described by unitary time evolution, so those details will just "work themselves out".

Of course you're right: IVR doesn't necessarily involve coupling to the environment. However, as I described above, if the system is not coupled to the environment in any way, then IVR MUST be described as a coherent process (at least if you're interested in the dynamics at long times). The coherence in these dynamics is the key to the coherent control processes being carried out all over the place using electronic states (Rydberg wave packets) and vibrational and rotational wave packets (molecular alignment), including by Brooks Pate. From his website:

"A major emphasis of our work is understanding the spectroscopy of molecules as the IVR process, and possibly reaction, occurs. In particular, we are interested in how coherent excitation of highly excited molecules can be used to influence reaction products."

It's neat stuff, as is the inclusion of dissipative processes for larger systems (like proteins and such). Dwayne Miller's group up in Toronto has done some fascinating experiments that show coherent processes in retinal in rhodopsin, as have Graham Fleming's group in Berkeley and Greg Engel's group at the University of Chicago, to name a few.
 
  • #40
A. Neumaier said:
To understand things in an extremely simplified but appropriate setting, consider the Lorenz attractor: it is a time-reversal invariant ordinary differential equation in 3 variables, and it nevertheless shows an increase of entropy for every initial ensemble close to the attractor, no matter whether you run the dynamics forward or backward. You can easily program the Lorenz attractor yourself in systems like Matlab or Mathematica and convince yourself that, once approximations are made, irreversibility "follows" easily from time-reversibility plus chaoticity.

Actually, the Lorenz system is dissipative, and running it backwards gives nonsense. Thus one needs to take a chaotic Hamiltonian system, such as the double pendulum (for values of the energy where it is chaotic).
http://en.wikipedia.org/wiki/Double_pendulum
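For anyone who wants to try this without deriving pendulum equations, here is a sketch with the area-preserving Chirikov standard map as an even simpler chaotic stand-in (not the double pendulum itself): the map is exactly reversible, yet the coarse-grained (histogram) entropy of an initially concentrated ensemble grows until the ensemble fills the chaotic region.

Code:
import numpy as np

K = 3.0                                    # kick strength, chaotic regime
rng = np.random.default_rng(0)
theta = rng.uniform(0, 0.05, 100_000)      # ensemble starts in a tiny cell
p = rng.uniform(0, 0.05, 100_000)

def coarse_entropy(theta, p, bins=50):
    h, _, _ = np.histogram2d(theta % (2 * np.pi), p % (2 * np.pi), bins=bins)
    q = h.ravel() / h.sum()
    q = q[q > 0]
    return -np.sum(q * np.log(q))

for step in range(8):
    print(step, coarse_entropy(theta, p))  # grows, then saturates
    p = p + K * np.sin(theta)              # area-preserving kick ...
    theta = theta + p                      # ... then free rotation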
 
  • #42
So, my current understanding is this:

1) In an isolated system, the "total entropy" stays constant at all times. This is true in the classical case (joint entropy = thermodynamic entropy - correlational information) as well as in the quantum case (von Neumann entropy).

2) In deriving the H-theorem, one has to neglect correlations. Since the joint entropy doesn't change, the increase in thermodynamic entropy is compensated by the increase of correlational information.

3) As far as Loschmidt's paradox is concerned, there's no fundamental difference between classical systems and quantum mechanical ones.

My next step is to read the papers provided by Demystifier.

[Also thanks to Einstein Mcfly for resolving the IVR problem. This system sounds interesting indeed. So I have been wrong in assuming that it is a relaxation process.]
 
  • #43
kith said:
So, my current understanding is this:

1) In an isolated system, the "total entropy" stays constant at all times. This is true in the classical case (joint entropy = thermodynamic entropy - correlational information) as well as in the quantum case (von Neumann entropy).

2) In deriving the H-theorem, one has to neglect correlations. Since the joint entropy doesn't change, the increase in thermodynamic entropy is compensated by the increase of correlational information.

3) As far as Loschmidt's paradox is concerned, there's no fundamental difference between classical systems and quantum mechanical ones.

Yes, that's it!

Moreover, the dynamics for which the H-theorem is derived is a coarse-grained, approximate (though for macroscopic systems quite accurate) dynamics for the state of the system. An exact dynamics would have to keep an infinite memory, and the H-theorem would no longer be derivable.
 
  • #44
kith said:
So, my current understanding is this:
1) In an isolated system, the "total entropy" stays constant at all times. This is true in the classical case (joint entropy = thermodynamic entropy - correlational information) as well as in the quantum case (von Neumann entropy).

Yes, but realize that you should say "marginal entropy", not "thermodynamic entropy". Joint entropy, marginal entropy, and correlation information (or mutual information) are all information-theoretic entropies and are dimensionless. Thermodynamic entropy is the marginal entropy times Boltzmann's constant, and Boltzmann's constant has the dimensions of thermodynamic entropy (e.g. joules per kelvin).

Correct me if I am wrong, but I think the reason the marginal entropy is the only thermodynamically relevant entropy is that the internal energy of a substance is the thing we can measure, and only the marginal entropy contributes to the energy. The correlations between the particles do not contribute to the internal energy or any other macroscopic thermodynamic variable. Does this sound correct?
 
  • #45
SpectraCat said:
Thank you for that explanation ... that is the way I understood the math to work in the case where the density matrix is diagonal. The issue with the example that I gave is that the off-diagonal elements of the matrix are time dependent, i.e. the magnitudes of the anharmonic cross-couplings between the vibrational modes depend on the populations of those modes (by population I mean the number of vibrational quanta in a given mode). I believe the ramification of this is that you cannot choose a unique basis that diagonalizes the density matrix at all times, but like I said, I am not that well-versed in the details of density matrices, so I may not have that completely correct.

The density matrix is a positive Hermitian operator at all times, so there is always a basis in which it is diagonal. As you said, if you fix a given basis, the sizes of the off-diagonal elements may vary, but there is always some basis in which the density matrix is diagonal. If the evolution is unitary, as I described above, then all that happens is that this diagonalizing basis rotates through Hilbert space. The eigenvalues remain fixed for all time.

There is a more general kind of evolution, evolution via a completely positive map, which not only alters the diagonalizing basis but can also change the eigenvalues. This type of evolution can always be realized as unitary evolution on a larger system containing the system you're interested in. One regards the extended system as system plus environment, so that the "fine-grained" entropy of the system of interest can change because of the coupling to the environment.
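A one-qubit sketch of such a map (the standard amplitude-damping channel written with Kraus operators; nothing specific to the molecular example): unlike a unitary, it changes the eigenvalue spectrum.

Code:
import numpy as np

gamma = 0.3   # damping probability
# Kraus operators with K0^+ K0 + K1^+ K1 = identity
K0 = np.array([[1, 0], [0, np.sqrt(1 - gamma)]])
K1 = np.array([[0, np.sqrt(gamma)], [0, 0]])

rho = np.array([[0.5, 0.5], [0.5, 0.5]])        # pure state, eigenvalues {0, 1}
rho_out = K0 @ rho @ K0.T + K1 @ rho @ K1.T     # completely positive map

print(np.linalg.eigvalsh(rho))      # [0, 1]: pure
print(np.linalg.eigvalsh(rho_out))  # both strictly between 0 and 1: mixed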

SpectraCat said:
However, I have an even more fundamental problem because I do not understand from a physical point of view how the entropy of the isolated vibrational system in the example I gave can possibly be time-invariant. It seems intuitively clear to me from the phenomenology of intramolecular vibrational redistribution that the process is entropically driven. In other words, the energy starts out localized as a single quantum of excitation in a single vibrational mode, and then becomes "randomized" as one or more quanta of excitation in multiple vibrational modes with lower energy. The total internal energy of the system remains constant, but the probability of the energy finding its way back into the mode that was initially excited is (I think) vanishingly small. That seems consistent with the evolution of the state from low entropy (all the energy in a single mode) to higher entropy (the energy redistributed among many modes).

A possible counter-argument to the description I gave above might be that, even though the energy is "randomized", at any instant in time it is described by a unique "pure state" of the system, which would have the same von Neumann entropy (zero) as the initial state with a single quantum of excitation in a single mode. This argument is probably valid for small molecules where the density of states is low; however, molecules belonging to symmetric point groups with degenerate irreducible representations have formal degeneracies that would give rise to non-zero entropies for particular combinations of vibrational quanta.

I would appreciate any insights you have on this ...

If the molecular system in question is well approximated by an isolated finite-state system, then there will always be recurrences where, after a finite time, the state comes very close to the initial state. As one makes the system larger and larger, these recurrences take longer and longer to occur.

Your argument about the pure state is exactly right. The entire system is in a pure state and evolves unitarily, and thus the entropy of the full system is zero at all times. However, the entropies of subsystems can certainly change in time. With a sufficiently complex molecule, the dynamics of an effective subsystem could appear to relax to a finite-entropy state for long periods. For example, suppose one begins with a pure state consisting of an electronic excitation unentangled with the vibrational degrees of freedom. We have S_e = 0 (the entropy of the electronic subsystem). As the full system evolves, the electronic degrees of freedom will become entangled with the vibrational degrees of freedom. This can easily result in the appearance of a non-zero S_e for long periods of time.
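Here is a minimal sketch of that last scenario, with the vibrational degrees of freedom truncated to a single two-level proxy so that scipy can exponentiate the Hamiltonian directly (obviously not a real molecule): S_e grows from 0 toward ln 2 as the excitation is exchanged.

Code:
import numpy as np
from scipy.linalg import expm

def entropy(rho):
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]
    return -np.sum(p * np.log(p))

# basis |e, v>: index = 2*e + v; coupling exchanges |1,0> <-> |0,1>
g = 1.0
H = np.zeros((4, 4))
H[1, 2] = H[2, 1] = g

psi0 = np.zeros(4)
psi0[2] = 1.0   # electronic excitation, vibration in its ground state

for t in (0.0, 0.3, np.pi / 4):
    psi = expm(-1j * H * t) @ psi0
    rho = np.outer(psi, psi.conj()).reshape(2, 2, 2, 2)
    rho_e = np.einsum('abcb->ac', rho)   # trace out the vibration
    print(t, entropy(rho_e))             # 0, then growing toward ln 2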

Hope this helps.
 
  • #46
Rap said:
Yes, but realize that you should say "marginal entropy", not "thermodynamic entropy". Joint entropy, marginal entropy, and correlation information (or mutual information) are all information-theoretic entropies and are dimensionless. Thermodynamic entropy is the marginal entropy times Boltzmann's constant, and Boltzmann's constant has the dimensions of thermodynamic entropy (e.g. joules per kelvin).
In an information-theoretic context, one can always set Boltzmann's constant to 1 or ln 2 by choosing the kelvin appropriately. Thus it is appropriate to equate the two entropies. (This is like choosing units such that c=1 in relativity or \hbar=1 in quantum mechanics.)
Rap said:
Correct me if I am wrong, but I think the reason the marginal entropy is the only thermodynamically relevant entropy is that the internal energy of a substance is the thing we can measure, and only the marginal entropy contributes to the energy. The correlations between the particles do not contribute to the internal energy or any other macroscopic thermodynamic variable. Does this sound correct?
One can measure various things, but they are all related to the thermodynamic entropy by the standard formalism.
 
  • #47
A. Neumaier said:
In an information-theoretic context, one can always set Boltzmann's constant to 1 or ln 2 by choosing the kelvin appropriately. Thus it is appropriate to equate the two entropies. (This is like choosing units such that c=1 in relativity or \hbar=1 in quantum mechanics.)

Yes. I just wanted to point out that the usual thermodynamic entropy is not identical to the marginal entropy: the marginal entropy is dimensionless, while the usual thermodynamic entropy has units of, e.g., joules per kelvin. I actually prefer to think of temperature as having units of energy, multiplying the change in marginal information entropy to yield the change in internal energy at constant volume and constant particle number.
 