Von Neumann Entropy: Temperature & Info Explained

In summary, thermodynamic entropy and von Neumann entropy are two different measures of information: the former is tied to temperature through the Gibbs equilibrium state, while the latter is defined for all states. The relationship between information theory and thermodynamics rests on the equality of Shannon and Boltzmann entropy, as described by Jaynes. However, there is no physical notion of temperature for non-equilibrium states, so the blanket statement that thermodynamic entropy equals von Neumann entropy is misleading. The erasure of quantum information by thermalization is one place where temperature and von Neumann entropy do meet.
  • #1
touqra
What's the difference between thermodynamic entropy and von Neumann entropy? In particular, how is temperature related to the von Neumann entropy?
Also, what has information got to do with these two entropies?
 
  • #2
touqra said:
What's the difference between thermodynamic entropy and von Neumann entropy? In particular, how is temperature related to the von Neumann entropy?
Also, what has information got to do with these two entropies?
Hi, thermodynamic entropy is the entropy of the Gibbs equilibrium state
s := exp(-beta H)/Tr[exp(-beta H)], where beta is to be interpreted as the inverse temperature and H as the Hamiltonian. The von Neumann entropy equals -Tr(s log s). It is just that this last formula makes sense for ALL states (i.e. positive semi-definite matrices with trace equal to unity). Therefore the von Neumann entropy does NOT relate in general to any meaningful notion of temperature.
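
To make this concrete, here is a minimal numerical sketch (not from the thread) that builds the Gibbs state s = exp(-beta H)/Tr[exp(-beta H)] and evaluates its von Neumann entropy -Tr(s log s); the two-level Hamiltonian and the value of beta are arbitrary illustrative choices.

```python
import numpy as np

# Illustrative sketch: Gibbs state s = exp(-beta*H)/Tr[exp(-beta*H)] for a
# toy two-level Hamiltonian, and its von Neumann entropy S = -Tr(s log s).
# The Hamiltonian and beta below are arbitrary example values.

H = np.diag([0.0, 1.0])        # two-level Hamiltonian with energy gap 1 (units where k_B = 1)
beta = 2.0                     # inverse temperature

# For a Hermitian H, exp(-beta*H) can be built from its eigendecomposition.
evals, evecs = np.linalg.eigh(H)
boltzmann = np.exp(-beta * evals)
s = (evecs * boltzmann) @ evecs.conj().T
s /= np.trace(s)               # normalize so Tr(s) = 1

# Von Neumann entropy from the eigenvalues of the state (convention: 0*log 0 = 0).
p = np.linalg.eigvalsh(s)
S = -sum(pi * np.log(pi) for pi in p if pi > 0)
print("Gibbs state:\n", s)
print("von Neumann entropy:", S)
```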

Cheers,

Careful
 
  • #3
Thermodynamic entropy is the same as Shannon entropy (see http://en.wikipedia.org/wiki/Shannon_entropy), and it is a measure of how much information is encoded in a probability distribution. The relationship between Information Theory and Thermodynamics indicated by the equality of Shannon and Boltzmann entropy was beautifully described in the papers of Jaynes (see http://en.wikipedia.org/wiki/Edwin_Jaynes).

The von Neumann entropy is the quantum-mechanical analogue, and its relation to Quantum Information Theory is explained in a very simple way in a paper by Plenio and Vitelli that you can find at

http://arxiv.org/abs/quant-ph/0103108
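
For readers unfamiliar with the Shannon entropy mentioned above, here is a minimal sketch of H(p) = -sum_i p_i log p_i for a classical probability distribution; the distributions used are arbitrary examples.

```python
import numpy as np

# Minimal sketch of the Shannon entropy H(p) = -sum_i p_i log p_i (in nats).
def shannon_entropy(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                       # convention: 0 * log 0 = 0
    return -np.sum(p * np.log(p))

print(shannon_entropy([0.5, 0.5]))     # ln 2 ~ 0.693: one bit of uncertainty
print(shannon_entropy([1.0, 0.0]))     # 0: no uncertainty at all
```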
 
  • #4
Alamino said:
Thermodynamic entropy is the same as Shannon entropy (see http://en.wikipedia.org/wiki/Shannon_entropy), and it is a measure of how much information is encoded in a probability distribution. The relationship between Information Theory and Thermodynamics indicated by the equality of Shannon and Boltzmann entropy was beautifully described in the papers of Jaynes (see http://en.wikipedia.org/wiki/Edwin_Jaynes).

The question touqra posed was how von Neumann entropy relates to a *physical* temperature. As far as I know, there exists no good physical notion of temperature for a general state which is not a thermal equilibrium (i.e. Gibbs) state. Therefore, the statement that thermodynamic entropy equals von Neumann entropy is extremely misleading, to say the very least.
 
  • #5
I don't remember saying "that thermodynamic entropy equals von Neumann entropy"... I said "analogous", as can be easily seen from the similarity of both formulas. And I remember that he also asked about the relation of both entropies to information, which was what my post was about.

Anyway, the last paper I indicated talks about the erasure of quantum information by thermalization and indicates where temperature enters in this matter (particularly, look at equation (47)).
 

1. What is Von Neumann Entropy?

Von Neumann entropy is a measure of the randomness or disorder in a quantum system. It is the quantum analogue of the Shannon entropy and is calculated from the density matrix of the system.

2. How is Von Neumann Entropy related to temperature?

In statistical mechanics, the thermodynamic entropy of a thermal (Gibbs) state is the von Neumann entropy multiplied by the Boltzmann constant, so for equilibrium states the two are directly linked: as the temperature of the system increases, the state becomes more mixed and its entropy increases. For a general non-equilibrium state there is no corresponding notion of temperature.
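
As a hedged illustration of this point for thermal (Gibbs) states only, the following sketch evaluates the entropy of a two-level system at a few temperatures; the energy gap and the temperature grid are arbitrary example values (units where k_B = 1).

```python
import numpy as np

# Thermal entropy of a two-level system with gap E at a few temperatures:
# the entropy grows monotonically with T, from 0 toward ln 2.
E = 1.0                                          # energy gap (example value)
for T in (0.2, 0.5, 1.0, 5.0):
    beta = 1.0 / T
    Z = 1.0 + np.exp(-beta * E)                  # partition function
    p = np.array([1.0, np.exp(-beta * E)]) / Z   # thermal populations
    S = -np.sum(p * np.log(p))
    print(f"T = {T:4.1f}  S = {S:.4f}")
```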

3. What is the significance of Von Neumann Entropy in information theory?

In information theory, Von Neumann Entropy is used to quantify the amount of uncertainty or unpredictability in a system. It is a measure of the average amount of information needed to describe the state of a system.

4. Can Von Neumann Entropy be negative?

No, Von Neumann Entropy can never be negative. It is always a non-negative value, with a minimum value of zero for a completely ordered or pure state.

5. How is Von Neumann Entropy calculated?

Von Neumann Entropy is calculated using the density matrix of a quantum system. The formula for calculating it is S = -Tr(ρ ln ρ), where S is the entropy and ρ is the density matrix.
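
A minimal sketch of this computation via the eigenvalues of ρ; the two example states below are arbitrary illustrations.

```python
import numpy as np

# Compute S = -Tr(rho ln rho) from the eigenvalues of the density matrix.
def von_neumann_entropy(rho):
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # drop zero eigenvalues (0 ln 0 = 0)
    return float(-np.sum(evals * np.log(evals)))

pure = np.array([[1.0, 0.0], [0.0, 0.0]])   # pure state |0><0|
mixed = np.eye(2) / 2                        # maximally mixed qubit
print(von_neumann_entropy(pure))    # 0.0
print(von_neumann_entropy(mixed))   # ln 2 ~ 0.693
```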
