
I Information Conservation

  1. Mar 9, 2017 #1
    I have read that "Information is conserved", or at least that it should be.
    I read this in some discussion about whether or not black holes destroy information.
    (ref: Leonard Susskind, Stephen Hawking, The Black Hole War, etc..)
    It is believed systems in distinct states evolve into distinct states,
    so that from a current state we can deduce uniquely the previous state.

    OK. But how does this fit with wavefunction collapse in quantum mechanics?
    When the wave function collapses, information is lost, isn't it?
    If a particle's spin is measured and collapses along z+, we cannot deduce the previous orientation from z+.

    So, does the wavefunction collapse violate the conservation of information?
     
  3. Mar 9, 2017 #2

    anorlunda

    Staff: Mentor

    If we have a quantum system with n states, then unitarity requires that time evolutions keep the number of states as n; never more than n or less than n. In that sense, it is the quantity of information that is conserved.

    Knowing which state it is actually in, should be called knowledge, not information. Knowledge can be gained or lost.

    Another way to state conservation of information is that the sum of the probabilities of all possibilities must add up to exactly 1, never more, never less.

    It appears to me that most confusion on this topic comes from dictionary definitions that say information is a synonym for knowledge. In this context, they are not the same thing.
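    The point above can be sketched in a few lines of numpy (my own illustration, not from the thread): unitary time evolution keeps the total probability at exactly 1, which is one way of phrasing the conservation of the quantity of information.

    ```python
    import numpy as np

    # A toy 2-state quantum system evolved by a unitary matrix.
    # Any real rotation matrix is unitary, so it preserves the norm
    # of the state vector, i.e. the probabilities always sum to 1.
    theta = 0.7
    U = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])

    state = np.array([0.6, 0.8])          # |amplitudes|^2 sum to 1
    for _ in range(5):
        state = U @ state                 # one time-evolution step
        total_probability = np.sum(np.abs(state) ** 2)
        # total_probability stays 1.0 (to machine precision) at every step

    print(round(total_probability, 10))   # 1.0
    ```

    No matter how many steps you apply, the number of basis states and the total probability are unchanged; only our knowledge of which state the system occupies can change.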
     
  4. Mar 11, 2017 #3

    BillTre

    User Avatar
    Science Advisor

    So, is there more to a definition of knowledge WRT these systems?

    Can one put a number on knowledge, in some way, as with information and entropy?
     
  5. Mar 11, 2017 #4

    anorlunda

    Staff: Mentor

    My professor, Leonard Susskind, likes to use a six-sided die (one of a pair of dice) as an example.

    The die can have 6 states. The (conserved) quantity of information is proportional to ##\ln(6)##.

    We can have maximum knowledge, such as the number facing up is 2.
    We can have zero knowledge, such as I have no idea which side is up.
    We can have partial knowledge, such as I know it is 1, or 2 or 3, but not 4, or 5 or 6.
    When knowledge is maximum, entropy is minimum.
    When knowledge is minimum, entropy is maximum.
    We can assign numbers to all of this, but other people might number them differently.
    The only well defined number in this case that we should all agree on is 6, for 6 possible states.

    So even with quantum systems, the key number is the number of possible states. One of the difficulties is that the quantum systems don't have to get very big before the number of possible states becomes very large. For example, I couldn't even guess the number of possible quantum states in a single modest Carbon-12 atom. Including all the electrons, nucleons, quarks and gluons, the number of quantum states would be large. Perhaps a PF member more expert than I could give some sample numbers.
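    The die example above can be put into a short numeric sketch (my own, under the convention that entropy is the log of the number of states still consistent with what you know):

    ```python
    import math

    # The conserved quantity of information for a six-sided die is
    # proportional to ln(6); entropy depends on what we know.
    n_states = 6
    information = math.log(n_states)          # ln(6), fixed by the system

    entropy_no_knowledge   = math.log(6)      # any face possible: maximum entropy
    entropy_partial        = math.log(3)      # known to be 1, 2, or 3
    entropy_full_knowledge = math.log(1)      # face known exactly: entropy 0.0

    # Maximum knowledge <-> minimum entropy, and vice versa:
    print(entropy_no_knowledge, entropy_partial, entropy_full_knowledge)
    ```

    The only number everyone must agree on is the 6 possible states; the entropy figures depend on each observer's knowledge.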
     
  6. Mar 11, 2017 #5

    BillTre

    User Avatar
    Science Advisor

    Going with the dice example (6 states):

    Not knowing which state the system is in (minimum knowledge): entropy ##\ln(6)##.
    Knowing it is in only 1 of the 6 possible states: entropy ##\ln(1) = 0##.
    Knowing it is one of 3 of the possible 6 states (half max knowledge): entropy ##\ln(3)##.

    What would be the units of knowledge?

    I was under the impression that entropy went as an inverse log function of information content.
    This seems to be saying that (in this case) it is more like an inverse function of the information content (the number of possible states) minus what is known of the system (the knowledge of which states are actually possible).
    So it is not strictly tied to information?
     
  7. Mar 11, 2017 #6

    anorlunda

    Staff: Mentor

    There are 18 definitions of entropy in different contexts. So let's not get into the formulas for measuring it.

    But in the case of the die, yes. You can say that entropy measures the unknown. It does have an inverse relationship to knowledge.

    Putting units on knowledge is difficult and context dependent. I think it is better to stick with the concepts and to forget putting numbers and units on it. The closest thing to a universal unit is the bit.
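    Taking the bit as the unit, as suggested above, the numbers come out like this (my own sketch): entropy in bits is the base-2 log of the number of equally likely possibilities.

    ```python
    import math

    # Entropy in bits: log base 2 of the number of equally likely states.
    bits_die  = math.log2(6)   # about 2.585 bits to specify one die face
    bits_coin = math.log2(2)   # exactly 1 bit for a fair coin

    print(bits_die, bits_coin)
    ```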
     
  8. Mar 12, 2017 #7
    I know about unitarity. But I'm talking here about the collapse of the wavefunction after a measurement, where we lose unitarity and time-reversal symmetry.
    Different systems can collapse into the same state, and one cannot recover the initial state from the final state (there is no way to go backward in time).
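    The non-invertibility being described can be sketched like this (my own illustration, not from the thread): a projective "collapse" maps many distinct initial spin states to the same outcome, so the map cannot be run backward, unlike unitary evolution.

    ```python
    import numpy as np

    up = np.array([1.0, 0.0])   # the z+ eigenstate

    def measure_z_collapse_to_up(state):
        """Project onto z+ and renormalize (the collapse branch where z+ is found)."""
        amp = up @ state                  # amplitude on z+
        return up * (amp / abs(amp))      # renormalized post-measurement state

    # Two different spin states, each with some z+ amplitude:
    a = np.array([np.sqrt(0.9), np.sqrt(0.1)])
    b = np.array([np.sqrt(0.3), np.sqrt(0.7)])

    # Both collapse to the identical state z+; the original is unrecoverable.
    print(measure_z_collapse_to_up(a))  # [1. 0.]
    print(measure_z_collapse_to_up(b))  # [1. 0.]
    ```

    Since two distinct inputs produce the same output, no function can take the post-measurement state back to the pre-measurement one.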
     
  9. Mar 12, 2017 #8

    anorlunda

    Staff: Mentor

    We never lose unitarity. If we did, information would not be conserved.

    Remember also that wave functions collapse only in some of the QM interpretations. I choose to ignore all interpretations until the day one of them is proved correct. Therefore, I say: what collapse?
     