Resources for information on Entropy

In summary, the conversation on PhysicsForums discusses the properties of the von Neumann entropy and its relation to the Shannon entropy. The von Neumann entropy of a mixture of pure states is the Shannon entropy of a measurement in the basis that diagonalizes the density operator. The "mixing entropy" is defined relative to a basis and is argued to be a more natural measure of information. The quantum noiseless coding theorem states that a quantum state can be transmitted with fidelity 1-\epsilon using about S(\rho) qubits per copy, which raises the question of how many classical bits are needed to describe the result of measuring \rho in an arbitrary basis.
  • #1
LukeD
Hi PhysicsForums,

I'm writing a term paper on the properties of the von Neumann Entropy, its relation to the Shannon Entropy, and the additional complications present in the von Neumann Entropy that are not present in the Shannon Entropy.

Could someone direct me to articles and books that will be helpful? Thanks!
My professor already directed me towards this article, which has been very helpful: A. Wehrl, "General properties of entropy," Rev. Mod. Phys. 50 (1978) 221.
 
  • #2
No one? In particular, I'm stuck on the relation between the von Neumann entropy and the Shannon entropy, but I think I'm beginning to understand.

Given a mixture of pure states [tex]\rho = \sum_i p_i |\psi_i \rangle\langle \psi_i|[/tex] (with the [tex]|\psi_i\rangle[/tex] orthonormal, so that the projectors [tex]\{ |\psi_i \rangle\langle \psi_i| \}[/tex] define a basis), the Shannon entropy of a measurement of [tex]\rho[/tex] in that basis is
[tex] S = -\sum_i p_i \ln(p_i) [/tex]
(In a book that I am looking through, they call this a "mixing entropy". A "mixing entropy" is defined relative to a basis.)
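
For concreteness, here is a minimal sketch (assuming NumPy) of this basis-dependent "mixing entropy": the Shannon entropy, in nats, of the outcome probabilities obtained by measuring [tex]\rho[/tex] in a given orthonormal basis. The function name and the small tolerance are illustrative choices, not anything standard.
[code]
import numpy as np

def mixing_entropy(rho, basis):
    """Shannon entropy (in nats) of measuring rho in the given orthonormal basis.

    rho   : (d, d) density matrix
    basis : (d, d) array whose columns are the orthonormal basis vectors
    """
    # Outcome probabilities p_i = <e_i| rho |e_i>, i.e. the diagonal of B^dagger rho B
    probs = np.real(np.diag(basis.conj().T @ rho @ basis))
    probs = probs[probs > 1e-12]          # drop zero-probability outcomes
    return float(-np.sum(probs * np.log(probs)))
[/code]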

The von Neumann entropy is the Shannon entropy of a measurement of [tex]\rho[/tex] in the basis that diagonalizes [tex]\rho[/tex]. So the von Neumann entropy can be read as an informational (Shannon) entropy only when we measure in this particular basis. Interestingly, this is the basis that minimizes the mixing entropy.
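
A quick numerical check of that claim (assuming NumPy, and using an arbitrary example state: an equal mixture of |0> and |+> on one qubit). The mixing entropy in the eigenbasis reproduces the von Neumann entropy, while the computational basis gives a larger value.
[code]
import numpy as np

def H(p):                                     # Shannon entropy in nats
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log(p)))

# Example state: equal mixture of |0> and |+>
ket0 = np.array([1.0, 0.0])
ketp = np.array([1.0, 1.0]) / np.sqrt(2)
rho = 0.5 * np.outer(ket0, ket0) + 0.5 * np.outer(ketp, ketp)

evals, evecs = np.linalg.eigh(rho)
S_vn = H(evals)                               # von Neumann entropy, ~0.416 nats

# Mixing entropy in the eigenbasis vs. in the computational basis
p_eig  = np.real(np.diag(evecs.conj().T @ rho @ evecs))
p_comp = np.real(np.diag(rho))
print(S_vn, H(p_eig), H(p_comp))              # ~0.416, ~0.416, ~0.562
[/code]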

Although the "mixing entropy" is defined relative to a basis, this seems to be a much more natural measure of information than the von Neumann entropy. We could even define the total "mixing entropy" as being [tex]S(\rho, \{P_1, P_2, ...\})[/tex], a function of both a density operator and a partition of the identity operator into projectors. Although it is a much more complicated beast, this function captures the total informational properties of our system, and if we do away with the entropy's dependence on the measurement, then we lose something.

Does anyone have anything to add that might be helpful? I haven't been able to find any information on entropy defined as a function of both [tex]\rho[/tex] and a basis, so I'd appreciate help in that direction.
 
  • #3
I'm also a bit confused about the Quantum Noiseless Coding theorem.
We have a source of quantum information specified by a density operator [tex]\rho[/tex] with von Neumann entropy [tex]S(\rho) = -\mathrm{Tr}(\rho \ln \rho)[/tex]. The theorem states that, asymptotically in the number of copies emitted by the source, we can transmit the state with fidelity [tex]1-\epsilon[/tex] as long as the noiseless channel provides at least [tex]S(\rho)[/tex] qubits per copy.
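
As a worked example of the quantities in that statement (assuming NumPy, and using an arbitrary single-qubit ensemble, the equal mixture of |0> and |+> from above), here is S(ρ) computed from the eigenvalues of ρ, first in nats and then converted to qubits per copy:
[code]
import numpy as np

ket0 = np.array([1.0, 0.0])
ketp = np.array([1.0, 1.0]) / np.sqrt(2)
rho = 0.5 * np.outer(ket0, ket0) + 0.5 * np.outer(ketp, ketp)

evals = np.linalg.eigvalsh(rho)
evals = evals[evals > 1e-12]
S_nats = -np.sum(evals * np.log(evals))      # von Neumann entropy in nats
S_qubits = S_nats / np.log(2)                # same entropy expressed in bits/qubits

print(S_nats)    # ~0.416 nats
print(S_qubits)  # ~0.601, i.e. roughly 0.6 qubits per copy suffice asymptotically
[/code]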

But I'm confused because we can only consider the von Neumann entropy to be a Shannon entropy when we measure in the basis that diagonalizes [tex]\rho[/tex], in which case it tells us the number of classical bits needed to describe the outcomes. In order to transmit a quantum state, however, the information must survive in every basis, not just the one that diagonalizes [tex]\rho[/tex]. Since the von Neumann entropy is the minimum of the "mixing entropies" I defined in my last post, the results of measurements in other bases need more classical bits to describe. How is it that only [tex]S(\rho)[/tex] qubits are enough for all of that information to make it through?

Obviously, the resolution is that I am using qubits, not classical bits, but I don't see how this works. Is there maybe some bound on the maximal "mixing entropy" of [tex]\rho[/tex] in some basis that allows me to store the rest of the information in the relative phases of the channel's qubits? In other words, what is the minimum number of classical bits needed to describe the result of measuring [tex]\rho[/tex] in an arbitrary basis? [tex]S(\rho)[/tex] gives the minimum needed in the case that I've picked the most efficient basis.
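
One numerical way to explore that question (assuming NumPy): for a single qubit, sampling many random orthonormal bases suggests that the mixing entropy always lies between [tex]S(\rho)[/tex] and [tex]\ln d = \ln 2[/tex], since a measurement with d outcomes can carry at most [tex]\ln d[/tex] nats. This is only a sanity check over random bases, not a proof.
[code]
import numpy as np

rng = np.random.default_rng(0)

ket0 = np.array([1.0, 0.0])
ketp = np.array([1.0, 1.0]) / np.sqrt(2)
rho = 0.5 * np.outer(ket0, ket0) + 0.5 * np.outer(ketp, ketp)

def H(p):                                     # Shannon entropy in nats
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log(p)))

S_vn = H(np.linalg.eigvalsh(rho))             # von Neumann entropy, ~0.416 nats

entropies = []
for _ in range(2000):
    # Random orthonormal basis from the QR decomposition of a complex Gaussian matrix
    Z = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
    Q, _ = np.linalg.qr(Z)
    entropies.append(H(np.real(np.diag(Q.conj().T @ rho @ Q))))

print(S_vn, min(entropies), max(entropies), np.log(2))
# min stays >= S(rho) ~ 0.416; max stays <= ln 2 ~ 0.693
[/code]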

---

Edit: Sorry, since I used ln instead of log2 in my definition of entropy, that should be nats (natural units), not bits.
 

What is entropy and why is it important to study?

Entropy is a measure of the disorder or randomness of a system. It is important to study because it helps us understand the natural processes that occur in our world, from the flow of energy to the organization of complex systems.

Where can I find reliable resources for information on entropy?

There are many reliable resources for information on entropy, including scientific journals, textbooks, and online databases such as ScienceDirect, JSTOR, and Google Scholar. You can also consult with experts in the field or attend conferences and workshops on the topic.

What are some practical applications of entropy?

Entropy has many practical applications, including in thermodynamics, information theory, ecology, and economics. It is also used in fields such as engineering, chemistry, and biology to understand and predict the behavior of complex systems.

How is entropy related to the second law of thermodynamics?

The second law of thermodynamics states that the total entropy of an isolated system never decreases over time, and increases in any irreversible process. This is because energy tends to spread out and become more disorganized, leading to an increase in entropy. The study of entropy is therefore closely tied to the second law of thermodynamics.

Are there any controversies or debates surrounding the concept of entropy?

Yes, there are ongoing debates and controversies in the scientific community about the definition and interpretation of entropy. Some argue that entropy is a purely physical concept, while others suggest it also has implications for fields such as psychology and economics. Additionally, there are ongoing discussions about the role of entropy in the creation and evolution of the universe.
