
Featured Insights Entanglement Entropy - Part 1: Quantum Mechanics - Comments

  1. Apr 12, 2017 #1

    ShayanJ

    User Avatar
    Gold Member

  3. Apr 12, 2017 #2

    stevendaryl

    User Avatar
    Staff Emeritus
    Science Advisor

    Very nice. I think this is a really important topic, when it comes to understanding quantum mechanics. I'm beginning to think that entanglement entropy is not just analogous to the entropy that almost always increases in classical thermodynamics, but that they might in some sense be the same thing. That is, I wonder whether the second law of thermodynamics can be understood in terms of entanglement?
     
  4. Apr 12, 2017 #3
Isn't the author kind of suggesting a hidden variable theory? He says that if an observer only sees part of the system, he is losing information that is contained in the state of the whole system. (I hope I did not mess that up.) Here are his own words:

     
  5. Apr 12, 2017 #4

    stevendaryl

    User Avatar
    Staff Emeritus
    Science Advisor

    Let's look at a particular example, namely EPR. It's known ahead of time that Alice's particle and Bob's particle are correlated: If Alice measures spin-up along some axis [itex]\vec{a}[/itex], then Bob will definitely measure spin-down along that axis.

    But if you try to give separate states for Alice's particle and Bob's particle, you would have to say:
    1. For any direction [itex]\vec{a}[/itex], Alice's particle has probability 50% of being spin-up along that axis.
    2. For any direction [itex]\vec{b}[/itex], Bob's particle has probability 50% of being spin-up along that axis.
But those two statements involve throwing away the information that Alice's and Bob's particles are correlated. Whether you consider a correlation to be a "hidden variable" or not is a matter of terminology, but it's definitely not a local hidden variable.
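To make the reduced-state description concrete, here is a small numpy sketch (my own illustration, not part of the thread): it builds the singlet state and traces out Bob's qubit, leaving Alice with the maximally mixed state, i.e. 50/50 along any axis.

```python
import numpy as np

# Singlet state |psi> = (|01> - |10>)/sqrt(2), basis order |00>, |01>, |10>, |11>
psi = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2)
rho = np.outer(psi, psi.conj())            # density matrix of the pair

# Partial trace over Bob: rho_A[i,j] = sum_k <i,k| rho |j,k>
rho4 = rho.reshape(2, 2, 2, 2)             # indices (i_A, i_B, j_A, j_B)
rho_A = np.einsum('ikjk->ij', rho4)

print(rho_A)                               # maximally mixed: diag(1/2, 1/2)
```

Since rho_A is proportional to the identity, it is invariant under any rotation of the measurement axis, which is exactly the "50% spin-up along any axis" statement above.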
     
  6. Apr 12, 2017 #5
I think you could; see:
    The second laws of quantum thermodynamics
    https://arxiv.org/abs/1305.5278

Instead of just the von Neumann entropy, their result uses an infinite family of Rényi entropies, which apparently are also entanglement measures. (Classically, the Rényi entropies are generalizations of the Shannon entropy and can be derived by postulating some reasonable axioms that the Shannon entropy satisfies.) There may be other approaches, but I know about this one because it is very much quantum-information-theoretic. I don't know it in detail, but if people are interested, we could dive into it (maybe in another thread).
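For anyone who wants to play with the Rényi entropies numerically, here is a sketch of mine (the function name is my own) of the standard definition ##S_\alpha = \log(\mathrm{Tr}\,\rho^\alpha)/(1-\alpha)##, where the limit ##\alpha \to 1## recovers the von Neumann entropy:

```python
import numpy as np

def renyi_entropy(rho, alpha):
    """Renyi entropy S_alpha = log(Tr rho^alpha) / (1 - alpha), in nats."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]           # drop numerical zeros
    if np.isclose(alpha, 1.0):             # alpha -> 1 gives the von Neumann entropy
        return -np.sum(evals * np.log(evals))
    return np.log(np.sum(evals ** alpha)) / (1.0 - alpha)

# For the maximally mixed qubit every Renyi entropy equals log(2)
rho_A = np.eye(2) / 2
for a in (0.5, 1.0, 2.0):
    print(a, renyi_entropy(rho_A, a))
```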
     
  7. Apr 12, 2017 #6

    ShayanJ

    User Avatar
    Gold Member

Well, it's an appealing line of thinking and there are some similarities, but it's not that straightforward. The general belief is that entanglement entropy contains the thermal entropy for a thermal state. It's also important to mention that entanglement entropy is not extensive! In the next part, I'll calculate the entanglement entropy for a thermal state of a field theory and you'll see that although it's not extensive, its high-temperature limit is!
People have also explored laws about entanglement entropy that resemble the laws of thermodynamics. But because direct QFT methods are so hard, people usually use holographic arguments, e.g. this paper.
My goal is to have a series that starts with Entanglement Entropy, then goes to Holography, and then to holographic methods for calculating the entanglement entropy of quantum field theories. But I'm just a master's student working on this, so it's only going to be an introduction.
     
  8. Apr 13, 2017 #7

    vanhees71

    User Avatar
    Science Advisor
    2016 Award

    That's a very nice Insight article. The only thing I'd make more clear is the definition of the reduced operator. It's a partial trace. Using your notation, the Hilbert space of the total system is spanned by the Kronecker-product basis
    $$|\Phi(n_a,n_b) \rangle=|\phi_{n_a}^a \rangle \otimes |\phi_{n_b}^b \rangle.$$
    Then the reduced statistical operator for subsystem ##a## is given by tracing out subsystem ##b##, i.e.,
$$\hat{\rho}_a=\mathrm{Tr}_{b} \hat{\rho} := \sum_{n_a,n_a',n_b} |\phi_{n_a}^a \rangle \langle \Phi(n_a,n_b)|\hat{\rho}|\Phi(n_a',n_b) \rangle \langle \phi_{n_a'}^a|.$$
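In code, this partial trace is just a contraction over the ##b## indices. A minimal numpy sketch of mine (finite dimensions assumed, function name my own):

```python
import numpy as np

def partial_trace_b(rho, dim_a, dim_b):
    """Trace out subsystem b from a density matrix on H_a tensor H_b."""
    rho = rho.reshape(dim_a, dim_b, dim_a, dim_b)   # (n_a, n_b, n_a', n_b')
    return np.einsum('ikjk->ij', rho)               # sum over n_b = n_b'

# Sanity check on a product state: tracing out b must return rho_a unchanged
rho_a = np.array([[0.7, 0.1], [0.1, 0.3]])
rho_b = np.array([[0.4, 0.0], [0.0, 0.6]])
rho = np.kron(rho_a, rho_b)
print(partial_trace_b(rho, 2, 2))                   # recovers rho_a
```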
     
  9. Apr 13, 2017 #8
    I've gotten some feedback requesting a greater explanation of the math and terminology of the entropy section.
     
  10. Apr 14, 2017 #9

    Mark Harder

    User Avatar
    Gold Member

    At the macroscopic level, one can identify entropy with a kind of degeneracy in the system. If there are multiple microstates consistent with the total energy and other conserved quantities, then S = k ln(number of degenerate microstates). What you seem to be saying about QM statistical mechanics is that when the one particle decays into 2, the process generates a 3rd state, the entangled one. Since the component states and the entangled one are degenerate (each is reachable from the others, with some probability), the entropy increases from the single particle state to the 2 produced particle states and still further when the entangled state is taken into account. Is this a reasonable interpretation of what you said?
     
  11. Apr 14, 2017 #10

    ShayanJ

    User Avatar
    Gold Member

    I added some extra explanation.
     
  12. Apr 14, 2017 #11

    ShayanJ

    User Avatar
    Gold Member

No, it's not! I wasn't talking about classical or quantum statistical mechanics. It's just quantum mechanics.
In statistical mechanics, entropy is a measure of the information lost to us as a result of ignoring the exact state of the system. Entanglement entropy is a measure of the amount of information contained in the state of the whole that is not contained in the "state" of each of the constituents. So it's a measure of the information lost as a result of not being able to examine the whole system. It doesn't have anything to do with the size of the system, but statistical entropy is about the size of the system, since we ignore the exact state precisely because it's a lot of information. The information is there, we just ignore it! But in the case of entanglement entropy, we're actually measuring how much information the whole system can give us about a subsystem that the subsystem itself is fundamentally unable to provide.
     
  13. Apr 14, 2017 #12

    vanhees71

    User Avatar
    Science Advisor
    2016 Award

It's the information-theoretical approach which made me understand what entropy is in the first place. A very good book on the information-theoretical approach to both classical and quantum statistical physics is

    A. Katz, Principles of Statistical Mechanics, W. H. Freeman and Company, San Francisco and London, 1967.
     
  14. Apr 15, 2017 #13

    Simon Phoenix

    User Avatar
    Science Advisor
    Gold Member

    Nice article, Shayan.

I've pondered the use of entropy in QM for many years and still don't feel I've really got the hang of it too well. I've always found Shannon's formulation and von Neumann's quantum generalization of it to be rather elegant and fundamental. For me, though, it isn't the entanglement entropy, per se, that's important but rather the *mutual* information - of course for pure states of bipartite systems the entanglement entropy and mutual information are proportional to one another.

    If we have 2 quantum systems ##A## and ##B## with total entropy ##S## and reduced entropies ##S_A## and ##S_B## then they are related by the Araki-Lieb inequality $$ \left| S_A - S_B \right| \leq S \leq S_A + S_B $$The RHS of this inequality is of course that of classical systems - the entropy of the whole must be less than or equal to the sum of the entropies of its constituents. The LHS is where the quantum magic comes in. For pure states of the combined ##AB## system the von Neumann entropy is zero so that in any (combined) pure state of 2 systems the quantum entropies of the 2 component pieces are equal, that is, ##S_A = S_B##.

The mutual information is a measure of the 'information content' of the correlation. In other words, if we only did measurements on the 2 systems alone, it is the amount of information we would miss by not considering joint properties. With the mutual information defined as ##I = S_A + S_B - S##, then using the AL inequality it's easy to show that $$ I \leq 2 \min \left\{ S_A , S_B \right\} $$The maximum is obtained when the smaller component system is maximally mixed. The classical version of the AL inequality would be $$ \max \left\{ S_A , S_B \right\} \leq S \leq S_A + S_B $$ so that the total entropy, classically, can't be less than the entropy of either of its constituents.
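These inequalities are easy to sanity-check numerically. Here is a numpy sketch of mine on a random pure state of a qubit and a qutrit; for a pure state ##S = 0##, so the AL inequality forces ##S_A = S_B## and the bound ##I \leq 2\min\{S_A, S_B\}## is saturated:

```python
import numpy as np

def vn_entropy(rho):
    """von Neumann entropy in bits."""
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return -np.sum(ev * np.log2(ev))

rng = np.random.default_rng(0)
dA, dB = 2, 3
psi = rng.normal(size=dA * dB) + 1j * rng.normal(size=dA * dB)
psi /= np.linalg.norm(psi)                      # random pure state on A tensor B

rho_full = np.outer(psi, psi.conj())
S = vn_entropy(rho_full)                        # ~0 for a pure state
rho4 = rho_full.reshape(dA, dB, dA, dB)
S_A = vn_entropy(np.einsum('ikjk->ij', rho4))   # trace out B
S_B = vn_entropy(np.einsum('kikj->ij', rho4))   # trace out A
I = S_A + S_B - S

# Araki-Lieb: |S_A - S_B| <= S <= S_A + S_B, and I <= 2*min(S_A, S_B)
print(S, S_A, S_B, I)
```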

    For EM fields, the 2-mode squeezed state can be considered to be a purification of the single mode thermal state - and the mutual information formalism tells us that the two-mode squeezed state is the most strongly correlated state of 2 modes of the EM field, subject to a mean energy constraint.

If we generalize this measure of correlation to multipartite quantum systems (e.g., ##I = S_A + S_B + S_C - S##) then some nice general properties can be derived for the evolution of correlations under unitary evolutions using only very elementary methods. The nice thing about this generalization to multipartite systems is that the mutual information is the only sensible measure that satisfies some reasonable properties - for example, if we had 2 uncorrelated (unentangled) systems ##A## and ##B##, each comprised of component parts, then any measure of correlation of the combined ##AB## system should just give the sum of the amount of correlation *within* each of ##A## and ##B##.

    I still think there's more insight to be gained from the use of entropy/information in QM - but it will take a more talented person than me to figure it out o0)
     
  15. Apr 16, 2017 #14

    stevendaryl

    User Avatar
    Staff Emeritus
    Science Advisor

I just wanted to expand on what @Simon Phoenix said about mutual information. I'm not exactly sure how to understand the meaning of von Neumann entropy in the weird cases where it's not a positive number (classically, entropy is always positive, and represents roughly the number of bits needed to completely describe the situation). But it is a very stark way to show the difference between classical probability and quantum probability.

    The idea of mutual information is shown in figure 1, below. If you have a composite system (and I've shown it as composed of a system each for experimenters Alice and Bob), then the information [itex]S[/itex] for the composite system can be broken down into three parts:
1. The mutual information relevant to both Alice and Bob, [itex]S(A:B)[/itex].
2. The information associated with Alice's measurement that isn't included in the mutual information, [itex]S(A|B)[/itex].
3. The information associated with Bob's measurement that isn't included in the mutual information, [itex]S(B|A)[/itex].
    If Alice ignores Bob, then her subsystem is characterized by [itex]S(A) = S(A|B) + S(A:B)[/itex]. Similarly, Bob's subsystem ignoring Alice is characterized by [itex]S(B) = S(B|A) + S(A:B)[/itex]. The various pieces of information fit together into a Venn diagram, where [itex]S(A:B)[/itex] is the overlap.

[Image: entanglement-entropy.jpg (figures 1 and 2: Venn diagrams of S(A|B), S(A:B), and S(B|A))]

    Coin flips
    The simplest example is a coin flip: Alice and Bob each flip a coin, and get "heads" or "tails". Alice's result says nothing about Bob's, and vice-versa, so the mutual information is 0. Each coin separately has an entropy of 1 bit. So the total entropy is 2 bits. This situation is shown in Figure 2.

    A pair of shoes
Another classical example is a pair of shoes. Imagine that you have a pair of shoes and randomly pick one to send to Alice and send the other to Bob. If Alice ignores Bob, then this seems just like a coin flip: she gets one of two possibilities that are equally likely. So [itex]S(A) = 1[/itex]. Similarly, [itex]S(B) = 1[/itex]. But in this case, Alice's result (left or right) tells her exactly what Bob's result will be (the opposite). There is only mutual information, and no information for Alice independent of Bob, or for Bob independent of Alice. So [itex]S = S(A:B) = 1[/itex].
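The coin-flip and shoe examples can be checked directly from the Shannon entropies of the joint distributions; a small numpy sketch of mine:

```python
import numpy as np

def shannon(p):
    """Shannon entropy of a probability distribution, in bits."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Joint distributions over (Alice, Bob)
coins = np.full((2, 2), 0.25)                  # two independent fair coins
shoes = np.array([[0.0, 0.5], [0.5, 0.0]])     # one shoe each, perfectly anticorrelated

mutual = {}
for name, joint in [("coins", coins), ("shoes", shoes)]:
    S = shannon(joint)
    S_A = shannon(joint.sum(axis=1))
    S_B = shannon(joint.sum(axis=0))
    mutual[name] = S_A + S_B - S

print(mutual)    # mutual information: 0 bits for coins, 1 bit for shoes
```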

    EPR
This is the essentially quantum situation. Alice and Bob each measure the spin of one of a pair of correlated particles along some axis. If Alice ignores Bob, then again her measurement seems like a coin flip: she gets one of two possible results, with equal probability. Similarly for Bob. So again [itex]S(A) = S(B) = 1[/itex]. But in this case, the total information is 0! There is only one possibility for a pair of entangled particles, so the composite information [itex]S[/itex] is zero (which is what the von Neumann entropy gives for the composite two-particle state). But these two facts, plus the definitions of mutual information and conditional information, lead us to absolutely weird conclusions:
    1. The mutual information [itex]S(A:B)[/itex] is 2.
    2. The conditional information for Alice and Bob is [itex]S(A|B) = S(B|A) = -1[/itex]
Neither of these makes any sense, classically. It says that of the information associated with Alice's measurement (1 bit), 2 bits are associated with the mutual information shared by Alice and Bob, and -1 bit is private to Alice. How can information be negative? How can the information for a subsystem be more than the information for the composite system? The mathematics works out, but it's hard to understand intuitively.
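For the record, these EPR numbers can be reproduced from the von Neumann entropies of the singlet state; a numpy sketch of mine:

```python
import numpy as np

def vn_entropy(rho):
    """von Neumann entropy in bits."""
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return -np.sum(ev * np.log2(ev))

# Singlet state |psi> = (|01> - |10>)/sqrt(2)
psi = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2)
rho_full = np.outer(psi, psi)
rho4 = rho_full.reshape(2, 2, 2, 2)

S = vn_entropy(rho_full)                        # 0: the pair is in a pure state
S_A = vn_entropy(np.einsum('ikjk->ij', rho4))   # 1 bit
S_B = vn_entropy(np.einsum('kikj->ij', rho4))   # 1 bit

print(S_A + S_B - S)    # mutual information S(A:B) = 2 bits
print(S - S_B)          # conditional information S(A|B) = -1 bit
```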
     
  16. Apr 17, 2017 #15

    Simon Phoenix

    User Avatar
    Science Advisor
    Gold Member

Yes, and the issue with the negativity of the conditional entropy is one of the reasons (amongst many) that I struggle with the interpretation of quantum entropies. Classically we have $$ I(A;B) = S(A) + S(B) - S = S(A) - S(A|B)$$ and one can define a quantum conditional entropy via $$ S(A|B) = S - S(B) $$ but this no longer has the intuitive classical meaning along the lines of "the uncertainty in ##A## given knowledge of ##B##" precisely because this 'conditional' entropy can be negative. The von Neumann entropy ##-\mathrm{Tr}\left(\rho \ln \rho\right)## is, of course, always greater than or equal to zero.

    The very appealing classical interpretation of information as a difference of uncertainties is not quite so straightforward in the quantum generalization.
     
  17. Apr 19, 2017 #16
    Great Insight! Looking forward to Part 2!
     
  18. May 3, 2017 #17
I have difficulty understanding the details of the topic "between the lines". Can you please give a simple example, maybe using a graphical representation, illustrating the bra-ket mathematics in this section?
    rhkail
     
  19. May 3, 2017 #18

    ShayanJ

    User Avatar
    Gold Member

It works better if you ask specific questions. At which line did it stop making sense to you?
     
  20. May 5, 2017 #19
I cannot figure out how the spins are encoded in a preferred direction (vector n). An example or a graphical representation in the complex plane would be helpful.
Moreover, why is the density operator rho not described by a matrix?
    rhkail
     
  21. May 5, 2017 #20
    "The arrow of time is an arrow of increasing correlations (with the surroundings)", Seth Lloyd said. So the von Neumann entropy can be considered as being consistent with the entanglement arrow of time, as Sandu Popescu et al showed 2009.
    rhkail
     