Resources for information on Entropy

SUMMARY

This discussion focuses on the properties of von Neumann Entropy and its relationship with Shannon Entropy, particularly in the context of quantum information theory. The user references A. Wehrl's article, "General properties of entropy," and explores the complexities of defining entropy in relation to measurement bases. Key points include the definition of mixing entropy and its implications for quantum state transmission, specifically regarding the Quantum Noiseless Coding theorem and the necessity of understanding the basis that diagonalizes the density operator ρ.

PREREQUISITES
  • Understanding of von Neumann Entropy and Shannon Entropy
  • Familiarity with density operators in quantum mechanics
  • Knowledge of quantum information theory, particularly the Quantum Noiseless Coding theorem
  • Basic concepts of measurement bases in quantum systems
NEXT STEPS
  • Research the implications of the Quantum Noiseless Coding theorem on quantum state transmission
  • Explore the concept of mixing entropy and its mathematical formulations
  • Study the relationship between different measurement bases and their impact on entropy calculations
  • Investigate the role of relative phases in quantum information storage and transmission
USEFUL FOR

Students and researchers in quantum physics, particularly those studying quantum information theory, entropy properties, and the mathematical foundations of quantum mechanics.

LukeD
Hi PhysicsForums,

I'm writing a term paper on the properties of the von Neumann Entropy, its relation to the Shannon Entropy, and the additional complications present in the von Neumann Entropy that are not present in the Shannon Entropy.

Could someone direct me to articles and books that will be helpful? Thanks!
My professor already directed me towards this article, which has been very helpful: A. Wehrl, "General properties of entropy," Rev. Mod. Phys. 50 (1978) 221.
 
No one? In particular, I'm stuck on the relation between the von Neumann entropy and the Shannon entropy, but I think I'm beginning to understand.

Given a mixture of orthonormal pure states [tex]\rho = \sum_i p_i |\psi_i \rangle\langle \psi_i|[/tex], the Shannon entropy of a measurement of [tex]\rho[/tex] in the basis [tex]\{ |\psi_i \rangle \}[/tex] is
[tex]S = -\sum_i p_i \ln p_i[/tex]
(In a book that I am looking through, they call this a "mixing entropy". A "mixing entropy" is defined relative to a basis.)
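To make the definition concrete, here is a small numerical sketch (NumPy-based; the function name `mixing_entropy` and the example state are my own illustration, not taken from any reference):

```python
import numpy as np

def mixing_entropy(rho, basis):
    """Shannon entropy (in nats) of measuring rho in an orthonormal basis.

    rho   : (d, d) density matrix
    basis : list of d orthonormal vectors |e_i>
    """
    probs = np.array([np.real(e.conj() @ rho @ e) for e in basis])
    probs = probs[probs > 1e-12]  # drop zero-probability outcomes (0 ln 0 = 0)
    return float(-np.sum(probs * np.log(probs)))

# A mixture of |0> and |1> with weights (0.75, 0.25), measured in the
# computational basis, reproduces S = -sum_i p_i ln p_i directly.
rho = np.diag([0.75, 0.25]).astype(complex)
e0, e1 = np.array([1, 0], complex), np.array([0, 1], complex)
S = mixing_entropy(rho, [e0, e1])
```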

The von Neumann entropy is the Shannon entropy of [tex]\rho[/tex] in the basis that diagonalizes [tex]\rho[/tex]. So we can regard the von Neumann entropy as an informational entropy only when we measure in that special basis. Interestingly, the eigenbasis is exactly the basis that minimizes the mixing entropy.
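Both claims can be checked numerically: compute the von Neumann entropy from the eigenvalues of [tex]\rho[/tex], then verify that measuring in a different basis gives a larger mixing entropy. A NumPy sketch with made-up example values:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho ln rho), from the eigenvalues of rho (in nats)."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log(evals)))

# rho is diagonal here, so its eigenbasis is the computational basis.
rho = np.diag([0.75, 0.25]).astype(complex)
S_vn = von_neumann_entropy(rho)

# Measuring instead in the Hadamard basis |+>, |-> gives p = (1/2, 1/2),
# so the mixing entropy there is ln 2, which exceeds S(rho).
plus = np.array([1, 1], complex) / np.sqrt(2)
minus = np.array([1, -1], complex) / np.sqrt(2)
p = [float(np.real(v.conj() @ rho @ v)) for v in (plus, minus)]
S_mix = float(-sum(q * np.log(q) for q in p))
```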

Although the "mixing entropy" is defined relative to a basis, it seems a much more natural measure of information than the von Neumann entropy. We could even define the total "mixing entropy" as [tex]S(\rho, \{P_1, P_2, ...\})[/tex], a function of both a density operator and a partition of the identity operator into projectors. Although it is a much more complicated beast, this function captures the total informational properties of our system; if we do away with the entropy's dependence on the measurement, we lose something.
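An entropy defined on a density operator together with a partition of the identity can be written down directly. A sketch (the name `partition_entropy` is my own; note the projectors may have rank greater than one, so this also covers coarse-grained measurements):

```python
import numpy as np

def partition_entropy(rho, projectors):
    """Entropy (nats) of the outcome distribution p_k = Tr(P_k rho),
    for projectors {P_k} that sum to the identity."""
    d = rho.shape[0]
    assert np.allclose(sum(projectors), np.eye(d))
    probs = np.array([np.real(np.trace(P @ rho)) for P in projectors])
    probs = probs[probs > 1e-12]
    return float(-np.sum(probs * np.log(probs)))

# Qutrit example: coarse-graining {|0>,|1>} vs {|2>} discards information,
# so the coarse-grained entropy is smaller than the fine-grained one.
rho = np.diag([0.5, 0.25, 0.25]).astype(complex)
coarse = [np.diag([1, 1, 0]).astype(complex), np.diag([0, 0, 1]).astype(complex)]
fine = [np.diag(row).astype(complex) for row in np.eye(3)]
S_coarse = partition_entropy(rho, coarse)
S_fine = partition_entropy(rho, fine)
```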

Does anyone have anything to add that might be helpful? I haven't been able to find any information on entropy defined as a function of both [tex]\rho[/tex] and a basis, so I'd appreciate help in that direction.
 
I'm also a bit confused about the Quantum Noiseless Coding theorem.
We have a source of quantum information specified by a density operator [tex]\rho[/tex] with von Neumann entropy [tex]S(\rho) = -\text{Tr}(\rho \ln \rho)[/tex]. The theorem states that, in the limit of many copies of the state, we can transmit each copy with fidelity [tex]1-\epsilon[/tex] as long as we have a noiseless channel of at least [tex]S(\rho)[/tex] qubits per copy.
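The qubit counting can be checked numerically for a single-qubit source: the eigenvalues of [tex]\rho^{\otimes n}[/tex] are binomial, and the smallest subspace carrying probability weight [tex]1-\epsilon[/tex] has dimension on the order of [tex]e^{n S(\rho)}[/tex], i.e. roughly [tex]n S(\rho)/\ln 2[/tex] qubits. A sketch under those assumptions (the values of `p`, `n`, and `eps` are arbitrary choices of mine):

```python
from math import comb, log, log2, ceil

p, n, eps = 0.9, 200, 0.01
S = -(p * log(p) + (1 - p) * log(1 - p))   # von Neumann entropy in nats

# Eigenvalues of rho^{(n)} = rho tensored n times, for rho = diag(p, 1-p):
# value p^k (1-p)^(n-k) with multiplicity C(n, k). Sort them descending.
eigs = sorted(((p**k * (1 - p)**(n - k), comb(n, k)) for k in range(n + 1)),
              reverse=True)

# Smallest subspace dimension capturing probability weight 1 - eps
dim, weight = 0, 0.0
for val, mult in eigs:
    if weight + val * mult >= 1 - eps:
        dim += ceil((1 - eps - weight) / val)  # partial use of the last shell
        break
    weight += val * mult
    dim += mult

qubits = log2(dim)       # qubits needed to hold this "typical" subspace
rate = n * S / log(2)    # Schumacher's asymptotic answer, n S(rho) / ln 2
# qubits is close to rate and well below the n qubits of the full space
```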

But I'm confused because we can only consider the von Neumann entropy to be a Shannon entropy when we know the basis that diagonalizes [tex]\rho[/tex]. Then this tells us the number of classical bits that are needed to describe the system. But in order to transmit a quantum state, the information needs to be transmitted in every basis, not just the one that diagonalizes [tex]\rho[/tex]. Since the von Neumann entropy is the minimum of "mixing entropies" that I defined in my last post, measurements in other bases need more classical bits to describe. How is it that I only need [tex]S(\rho)[/tex] qubits for all types of information to make it through?

Obviously, the resolution is that I am using qubits, not classical bits, but I don't see how this works. Is there maybe some bound on the maximal "mixing entropy" of [tex]\rho[/tex] in some basis that allows me to store the rest of the information in the relative phases of the channel's qubits? In other words, what is the minimum number of classical bits needed to describe the result of measuring [tex]\rho[/tex] in an arbitrary basis? [tex]S(\rho)[/tex] gives the minimum needed in the case that I've picked the most efficient basis.
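On the last question, one bound is easy to state: for any orthonormal basis the mixing entropy of a [tex]d[/tex]-dimensional system is at most [tex]\ln d[/tex] (one nit, i.e. one bit per qubit when [tex]d = 2[/tex]), and at least [tex]S(\rho)[/tex]. A quick random-basis check of both bounds (NumPy; the sampling scheme is just my illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
rho = np.diag([0.75, 0.25]).astype(complex)
d = rho.shape[0]
S_vn = float(-np.sum([lam * np.log(lam) for lam in (0.75, 0.25)]))

def random_basis(d):
    # Orthonormal basis from the QR decomposition of a random complex matrix
    z = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    q, _ = np.linalg.qr(z)
    return [q[:, i] for i in range(d)]

def mix_entropy(rho, basis):
    pr = np.array([np.real(e.conj() @ rho @ e) for e in basis])
    pr = pr[pr > 1e-12]
    return float(-np.sum(pr * np.log(pr)))

samples = [mix_entropy(rho, random_basis(d)) for _ in range(1000)]
# Every sample lies between S(rho) (eigenbasis) and ln d (unbiased basis)
```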

---

Edit: Sorry, since I used ln instead of log2 in my definition of entropy, that should be nits, not bits.
 
