
Resources for information on Entropy

  1. May 8, 2010 #1
    Hi PhysicsForums,

    I'm writing a term paper on the properties of the von Neumann entropy, its relation to the Shannon entropy, and the complications that arise for the von Neumann entropy but not for the Shannon entropy.

    Could someone direct me to articles and books that will be helpful? Thanks!
    My professor already directed me towards this article, which has been very helpful: A. Wehrl, "General properties of entropy," Rev. Mod. Phys. 50 (1978) 221.
     
  2. May 9, 2010 #2
    No one? In particular, I'm stuck on the relation between the von Neumann entropy and the Shannon entropy, but I think I'm beginning to understand.

    Given a mixture of (orthonormal) pure states [tex]\rho = \sum_i p_i |\psi_i \rangle\langle \psi_i|[/tex], the Shannon entropy of a measurement of [tex]\rho[/tex] in the basis [tex]\{ |\psi_i \rangle \}[/tex] is
    [tex] S = -\sum_i p_i \ln p_i [/tex]
    (The book I'm looking through calls this a "mixing entropy"; a mixing entropy is defined relative to a choice of basis.)

    The von Neumann entropy is the Shannon entropy of [tex]\rho[/tex] in the basis that diagonalizes [tex]\rho[/tex]. So we can regard the von Neumann entropy as an informational entropy only when we measure in this special basis. Interestingly, this is also the basis that minimizes the mixing entropy.
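
    Here is a quick numerical check of this (my own sketch in Python with NumPy; helper names like mixing_entropy are just mine), showing that the Shannon entropy of the eigenvalues reproduces [tex]S(\rho)[/tex], while measuring the same state in a rotated basis gives a larger mixing entropy:

    [code]
import numpy as np

def shannon_entropy(p, eps=1e-12):
    """Shannon entropy (in nats) of a probability vector p."""
    p = np.asarray(p, dtype=float)
    p = p[p > eps]
    return float(-np.sum(p * np.log(p)))

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho ln rho), computed from the eigenvalues of rho."""
    return shannon_entropy(np.linalg.eigvalsh(rho))

def mixing_entropy(rho, basis):
    """Shannon entropy of the outcome distribution when rho is measured
    in the orthonormal basis given by the columns of `basis`."""
    probs = [np.real(np.vdot(basis[:, k], rho @ basis[:, k]))
             for k in range(basis.shape[1])]
    return shannon_entropy(probs)

# A qubit state, diagonal in the computational basis with weights (0.8, 0.2).
rho = np.diag([0.8, 0.2])

# Measuring in the eigenbasis reproduces the von Neumann entropy ...
print(von_neumann_entropy(rho), mixing_entropy(rho, np.eye(2)))  # both ~0.500

# ... while measuring in the rotated (|+>, |->) basis gives a larger entropy.
hadamard = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)
print(mixing_entropy(rho, hadamard))                             # ln 2 ~ 0.693
    [/code]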

    Although the "mixing entropy" is defined relative to a basis, this seems to be a much more natural measure of information than the von Neumann entropy. We could even define the total "mixing entropy" as being [tex]S(\rho, \{P_1, P_2, ...\})[/tex], a function of both a density operator and a partition of the identity operator into projectors. Although it is a much more complicated beast, this function captures the total informational properties of our system, and if we do away with the entropy's dependence on the measurement, then we lose something.

    Does anyone have anything to add that might be helpful? I haven't been able to find any information on entropy defined as a function of both [tex]\rho[/tex] and a basis, so I'd appreciate help in that direction.
     
  3. May 9, 2010 #3
    I'm also a bit confused about the quantum noiseless coding theorem (Schumacher's theorem).
    We have a source of quantum information specified by a density operator [tex]\rho[/tex] with von Neumann entropy [tex]S(\rho) = -Tr(\rho \ln \rho)[/tex]. The theorem states that, in the limit of many independent copies of the source, we can transmit the states with fidelity [tex]1-\epsilon[/tex] as long as we have a noiseless channel carrying [tex]S(\rho)[/tex] qubits per copy.

    But I'm confused because we can only regard the von Neumann entropy as a Shannon entropy when we measure in the basis that diagonalizes [tex]\rho[/tex]. In that case it tells us the number of classical bits needed to describe the measurement results. But in order to transmit a quantum state, the information needs to survive in every basis, not just the one that diagonalizes [tex]\rho[/tex]. Since the von Neumann entropy is the minimum of the "mixing entropies" I defined in my last post, measurement results in other bases need more classical bits to describe. How is it that I only need [tex]S(\rho)[/tex] qubits per copy for all types of information to make it through?

    Obviously, the resolution is that I am using qubits, not classical bits, but I don't see how this works. Is there maybe some bound on the maximal "mixing entropy" of [tex]\rho[/tex] in some basis that allows me to store the rest of the information in the relative phases of the channel's qubits? In other words, what is the minimum number of classical bits needed to describe the result of measuring [tex]\rho[/tex] in an arbitrary basis? [tex]S(\rho)[/tex] gives the minimum needed in the case that I've picked the most efficient basis.
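
    To get a feel for where the [tex]S(\rho)[/tex]-qubits-per-copy rate comes from, here is a rough counting sketch of the typical-subspace argument behind the theorem (my own numerics in Python; the function name qubit_rate is just for illustration). The eigenvalues of [tex]\rho^{\otimes n}[/tex] concentrate on a subspace of roughly [tex]e^{n S(\rho)}[/tex] dimensions, i.e. about [tex]n S(\rho)/\ln 2[/tex] qubits, and projecting onto that subspace keeps the state intact with high fidelity, whatever basis it is later measured in:

    [code]
from math import comb, log2

def qubit_rate(p, n, eps=0.01):
    """Qubits per source copy needed so that a subspace of the n-qubit space
    carries probability at least 1 - eps of rho^{(tensor) n}, for a qubit
    state rho with eigenvalues (p, 1 - p) and p >= 1/2."""
    # rho^{(tensor) n} has eigenvalue p^k (1-p)^(n-k) with multiplicity C(n, k).
    # For p >= 1/2 the largest eigenvalues sit at k = n, so keep the blocks
    # k = n, n-1, ... until the retained probability reaches 1 - eps.
    block_weight = p ** n                 # weight of the k = n block
    kept_prob, kept_dim = 0.0, 0
    for k in range(n, -1, -1):
        kept_dim += comb(n, k)
        kept_prob += block_weight
        if kept_prob >= 1 - eps:
            break
        # weight of block k-1: ratio C(n,k-1)/C(n,k) times (1-p)/p
        block_weight *= (k / (n - k + 1)) * ((1 - p) / p)
    return log2(kept_dim) / n

# The qubit rate slowly decreases toward the binary entropy H2(0.8),
# which is S(rho) expressed in bits rather than nats.
for n in (50, 200, 1000, 2000):
    print(n, round(qubit_rate(0.8, n), 3))
print("H2(0.8) =", round(-(0.8 * log2(0.8) + 0.2 * log2(0.2)), 3))  # ~0.722
    [/code]

    The convergence is slow, but the per-copy rate heads toward [tex]S(\rho)/\ln 2[/tex] qubits (that is, [tex]S(\rho)[/tex] expressed in bits), not toward the larger mixing entropy of any particular non-eigenbasis measurement.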

    ---

    Edit: Sorry, since I used ln instead of log2 in my definition of entropy, those should be nats, not bits.
     
    Last edited: May 9, 2010