# Entropy in quantum mechanics and the second law of thermodynamics

• tom.stoer
In summary, the conversation discusses the definition of entropy and its time evolution in quantum mechanics, touching on subsystems and reduced density operators, the role of unitary evolution in preserving pure states, the increase of entropy in classical mechanics, and the connection between entropy increase and mixing.

#### tom.stoer

Can you please tell me where I am wrong?

I define the entropy (as usual) as

$$S[\rho] = -k_B \text{tr}(\rho\ln\rho)$$

The time development of the density operator is

$$\rho(t) = U(t)\rho(0) U^\dagger(t)$$

That means

$$S[\rho(t)] = S[U(t)\rho(0) U^\dagger(t)] = S[\rho(0)]$$

where I used the invariance of any trace of the density operator w.r.t. a general unitary transformation (here: time translation)

But that means that the entropy of my system is always constant

$$\frac{dS}{dt} = 0$$

Where is my mistake?
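The invariance used above is easy to check numerically. Here is a minimal NumPy sketch (illustrative only, with $k_B = 1$): it builds a random mixed state and a random unitary $U(t) = e^{-iHt}$, then compares the von Neumann entropy before and after the evolution.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S = -tr(rho ln rho), computed from the eigenvalues of rho (k_B = 1)."""
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]              # drop numerically zero eigenvalues: 0 ln 0 = 0
    return -np.sum(p * np.log(p))

rng = np.random.default_rng(0)

# Random mixed state: rho = A A† / tr(A A†) is positive with unit trace.
A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
rho0 = A @ A.conj().T
rho0 /= np.trace(rho0).real

# Random Hamiltonian H and the unitary U(t) = exp(-iHt) at t = 0.7.
B = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
H = (B + B.conj().T) / 2
w, V = np.linalg.eigh(H)
U = V @ np.diag(np.exp(-1j * w * 0.7)) @ V.conj().T

rho_t = U @ rho0 @ U.conj().T
print(von_neumann_entropy(rho0), von_neumann_entropy(rho_t))  # equal
```

The two printed values agree to machine precision, confirming that a unitary conjugation leaves the spectrum of $\rho$, and hence $S$, unchanged.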

The mistake is obvious.

$$\mbox{tr}\left(\rho (t) \ln\rho (t)\right) =\mbox{tr}\left[U(t) \rho (0) U^{\dagger} (t) \left[\ln U(t) + \ln\rho (0) + \ln U^{\dagger} (t)\right]\right] \neq \mbox{tr} \left(\rho(0) \ln \rho(0)\right)$$

Naively I would define ln(A) as a power series in terms of A (just like for any other function f(A)); but as soon as I do this the equality is trivial. So why is this power series expansion not valid here?

If I just perform a change of basis generated by a unitary U, there MUST NOT be any U dependence; the algebraic properties do not depend on the specific choice of U (here it happens to be the time translation); the argument is valid for every U.

And I mean, you have not shown that the two expressions are indeed different; you just stopped calculating :-)

Let's look at a density operator

$$\rho = \sum_n p_n|n\rangle\langle n|$$

where the $|n\rangle$ are energy eigenstates. That means that the time evolution of the states is trivial

$$|n,t\rangle = e^{-iHt}|n\rangle = e^{-iE_nt}|n\rangle$$

and that the density operator is constant

$$\rho(t) = \rho(0)$$

That means that in the case of a purely unitary time evolution the entropy is always constant

$$\frac{dS}{dt} = 0$$

That means that the only way to get a non-constant entropy is to have a non-unitary time-evolution with

$$\rho(t) = \sum_n p_n(t)|n\rangle\langle n|$$

where again I assume time-independent H, i.e. trivial time-evolution of the states.

Is it still possible to derive

$$\frac{dS}{dt} \ge 0$$

and what are "reasonable" assumptions or conditions for the time-dependence of the probabilities?
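One standard answer (a sketch of one sufficient condition, not the most general one): if the $p_n(t)$ obey a Pauli master equation with symmetric transition rates $W_{nm} = W_{mn}$, the classical H-theorem gives $dS/dt \ge 0$. A small NumPy illustration with made-up rates:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 5

# Symmetric transition rates W[n, m] = W[m, n] >= 0; this symmetry is the
# "reasonable" condition under which the classical H-theorem applies.
W = rng.random((N, N))
W = (W + W.T) / 2
np.fill_diagonal(W, 0.0)

def dpdt(p):
    # Pauli master equation: dp_n/dt = sum_m W_nm (p_m - p_n)
    return W @ p - W.sum(axis=1) * p

def entropy(p):
    q = p[p > 1e-15]
    return -np.sum(q * np.log(q))

# Start far from equilibrium and integrate with small Euler steps.
p0 = np.array([0.9, 0.05, 0.03, 0.015, 0.005])
p, S_prev, dt = p0.copy(), entropy(p0), 1e-3
for _ in range(20000):
    p = p + dt * dpdt(p)
    S = entropy(p)
    assert S >= S_prev - 1e-12    # S never decreases along the flow
    S_prev = S
print(p)  # approaches the uniform (maximum-entropy) distribution 1/N
```

With symmetric rates the uniform distribution is stationary, and the entropy rises monotonically toward its maximum $\ln N$.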

I think you might need to look at some reduced density matrices.

I don't want to talk about subsystems and reduced density operators; for subsystems and entanglement, energy and particle exchange, ... it's clear that I may get time-dependent p(t).

I want to look at the whole (isolated) quantum system.

If this quantum system has been prepared in a non-equilibrium state, I know that its entropy will grow (classically). How can I prove this in quantum mechanics?

Well, we know that an isolated system always evolves unitarily.

Among the foundations of quantum statistical mechanics there is the random-phase postulate. This means that we have no knowledge of the many-particle correlations, which amounts to looking at the one-particle reduced density matrix.

Well, this statement isn't correct: 'random phase' refers to the phases between different eigenstates of the whole system, not between different single-particle levels. I guess I need to think more about this.

tom.stoer said:
Can you please tell me where I am wrong?

I define the entropy (as usual) as

$$S[\rho] = -k_B \text{tr}(\rho\ln\rho)$$

The time development of the density operator is

$$\rho(t) = U(t)\rho(0) U^\dagger(t)$$

That means

$$S[\rho(t)] = S[U(t)\rho(0) U^\dagger(t)] = S[\rho(0)]$$

where I used the invariance of any trace of the density operator w.r.t. a general unitary transformation (here: time translation)

But that means that the entropy of my system is always constant

$$\frac{dS}{dt} = 0$$

Where is my mistake?

There is no mistake. Unitary evolution implies that pure states remain pure, and that entropy is constant.

The increase of entropy is a property of systems in which the joint dynamics with the environment is integrated out, so that a dissipative dynamics for the system alone results, described by a Lindblad equation rather than the von Neumann equation.
Here entropy increases until equilibrium is reached.
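As a toy illustration of this Lindblad setting (a minimal sketch, not taken from the thread): a qubit undergoing pure dephasing, $\dot\rho = \gamma(\sigma_z\rho\sigma_z - \rho)$, starts in the pure state $|+\rangle$ and relaxes to the maximally mixed state, with the entropy growing monotonically from $0$ to $\ln 2$.

```python
import numpy as np

sz = np.array([[1, 0], [0, -1]], dtype=complex)

def lindblad_rhs(rho, gamma):
    # Pure dephasing in Lindblad form: d rho/dt = gamma (sz rho sz - rho)
    # (H = 0 here; sz† sz = identity, so the anticommutator term simplifies).
    return gamma * (sz @ rho @ sz - rho)

def entropy(rho):
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]
    return -np.sum(p * np.log(p))

# Start in the pure state |+> = (|0> + |1>)/sqrt(2): S = 0.
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
rho = np.outer(plus, plus.conj())

gamma, dt = 1.0, 1e-3
S_prev = entropy(rho)
for _ in range(5000):
    rho = rho + dt * lindblad_rhs(rho, gamma)
    S = entropy(rho)
    assert S >= S_prev - 1e-10    # entropy grows toward equilibrium
    S_prev = S
print(S, np.log(2))  # S approaches ln 2, the maximally mixed value
```

The off-diagonal elements of $\rho$ decay as $e^{-2\gamma t}$ while the diagonal stays fixed, so purity is lost and the entropy saturates at $\ln 2$.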

tom.stoer said:
I don't want to talk about subsystems and reduced density operators; for subsystems and entanglement, energy and particle exchange, ... it's clear that I may get time-dependent p(t).

I want to look at the whole (isolated) quantum system.

If this quantum system has been prepared in a non-equilibrium state, I know that its entropy will grow (classically). How can I prove this in quantum mechanics?
See S. Weinberg, The Quantum Theory of Fields I, Sec. 3.6, pages 150-151.

Already in classical mechanics, Liouville's theorem proves that the microscopic entropy of an isolated system remains constant in time as long as you describe the motion of all particles with Newtonian dynamics. You need some mechanism like coarse graining, or an explicitly irreversible equation like Boltzmann's equation, to obtain an entropy increase. The problem is that a finite isolated system will always return arbitrarily close to its initial configuration in phase space if you wait long enough (Zermelo's recurrence paradox). However, this recurrence time becomes very large already for quite small systems, so you should take into account the coupling to the surroundings, however weak it may be.

A. Neumaier said:
There is no mistake. Unitary evolution implies that pure states remain pure, and that entropy is constant.
My assumption is slightly more general. I can start with a mixed state (but still unitary evolution).

tom.stoer said:
My assumption is slightly more general. I can start with a mixed state (but still unitary evolution).

My second statement didn't assume the first as hypothesis, but was independent of it.
Your argument is well-known (and probably already in von Neumann's 1932 book).

There is a connection between the two: entropy increase is typically accompanied by mixing, i.e., by the fact that purity is not preserved by the evolution. This is always the case when the environment is properly taken into account and the Markov approximation is made to remove the memory of past environment influence.

However, there are more approximate models involving Hamiltonians with an (antihermitian) optical potential where there is no mixing but entropy typically still increases.

Fine, thanks. So what is the most general framework (assumptions regarding H etc.) in which dS/dt ≥ 0 can be proven?

tom.stoer said:
Fine, thanks. So what is the most general framework (assumptions regarding H etc.) in which dS/dt ≥ 0 can be proven?

It is called a quantum dynamical semigroup, a concept derived and well studied in the decade before 1980. The generator can (under very mild assumptions) be put in the form of a so-called Lindblad operator. You can find lots of papers about them using http://scholar.google.com or http://xxx.lanl.gov/multi?group=physics&/find=Search

tom.stoer said:
Fine, thanks. So what is the most general framework (assumptions regarding H etc.) in which dS/dt ≥ 0 can be proven?
Have you seen my post #9?

Thanks again

Demystifier said:
Have you seen my post #9?
yes, thanks; I'll check that

tom.stoer said:
yes, thanks; I'll check that

Weinberg derives a classical master equation involving quantum transition processes.
This is (but Weinberg doesn't make the connection) the diagonal part of a corresponding quantum master equation, which is governed by a Lindblad operator.

## 1. What is entropy in quantum mechanics?

Entropy in quantum mechanics (the von Neumann entropy, $S = -k_B \text{tr}(\rho\ln\rho)$) is a measure of the disorder or randomness in a quantum system. It is a fundamental concept in thermodynamics and is also used in quantum information theory to quantify the amount of uncertainty in a system.

## 2. How is entropy related to the second law of thermodynamics?

The second law of thermodynamics states that the total entropy of a closed system cannot decrease over time. This means that in any physical process, the total amount of disorder or randomness in the system will either remain constant or increase. Therefore, entropy is closely related to the second law of thermodynamics as it provides a quantitative measure of the level of disorder in a system.

## 3. Can entropy be reversed in quantum systems?

No, the second law of thermodynamics states that the total entropy can only increase or remain constant; it cannot decrease. This also holds for quantum systems, where the laws of thermodynamics still apply. However, in certain cases it is possible to reduce the entropy of a system by transferring entropy to its surroundings, but the total entropy of the combined systems will still not decrease.

## 4. How does the concept of quantum entanglement affect entropy?

Quantum entanglement is a phenomenon in which two or more particles become connected in such a way that the state of one particle depends on the state of the other(s), even when they are separated by large distances. For a subsystem, entanglement actually raises the entropy: tracing out its entangled partner leaves the subsystem in a mixed state with nonzero entanglement entropy, even when the global state is pure and has zero entropy. The overall entropy of the full system still obeys the second law of thermodynamics.
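A concrete check of the interplay between entanglement and entropy (a minimal NumPy sketch): for a two-qubit Bell state the global state is pure (zero entropy), while the reduced state of either qubit, obtained by a partial trace, is maximally mixed with entropy $\ln 2$.

```python
import numpy as np

def entropy(rho):
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]
    return -np.sum(p * np.log(p))

# Bell state |Φ+> = (|00> + |11>)/sqrt(2) of two qubits.
phi = np.zeros(4, dtype=complex)
phi[0] = phi[3] = 1 / np.sqrt(2)
rho_total = np.outer(phi, phi.conj())

# Partial trace over the second qubit: reshape the 4x4 matrix to the
# tensor rho[a, b, a', b'] and sum over the b = b' indices.
rho_A = np.trace(rho_total.reshape(2, 2, 2, 2), axis1=1, axis2=3)

print(entropy(rho_total))  # ~0: the global state is pure
print(entropy(rho_A))      # ~ln 2: the subsystem is maximally mixed
```

The reduced state comes out as the identity over 2, so the subsystem carries maximal entanglement entropy even though the whole system carries none.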

## 5. Can entropy be used to predict the behavior of quantum systems?

Entropy can be used to make probabilistic predictions about the behavior of quantum systems. However, due to the inherent randomness and uncertainty of quantum mechanics, it cannot be used to make precise predictions about the exact state or behavior of a system. Entropy is a statistical measure that helps us understand the overall behavior of a large number of quantum particles, rather than individual particles.