Does the Wavefunction Collapse Violate Information Conservation?

In summary: it is believed that systems in distinct states evolve into distinct states, so that from the current state we can uniquely deduce the previous one. But when the wave function collapses, information seems to be lost. Does the wavefunction collapse therefore violate the conservation of information? One answer: for a quantum system with n states, unitarity requires that time evolution keep the number of states at exactly n, never more and never less, and in that sense it is the quantity of information that is conserved.
  • #1
eltodesukane
I have read that "Information is conserved", or at least that it should be.
I read this in some discussion about whether or not black holes destroy information.
(ref: Leonard Susskind, Stephen Hawking, The Black Hole War, etc.)
It is believed that systems in distinct states evolve into distinct states,
so that from a current state we can uniquely deduce the previous state.

OK. But how does this fit with the wavefunction collapse in quantum mechanics?
When the wave function collapses, information is lost, isn't it?
If a particle's spin is measured and collapses along z+, we cannot deduce the previous orientation from z+ alone.

So, does the wavefunction collapse violate the conservation of information?
 
  • #2
If we have a quantum system with n states, then unitarity requires that time evolutions keep the number of states as n; never more than n or less than n. In that sense, it is the quantity of information that is conserved.

Knowing which state it is actually in, should be called knowledge, not information. Knowledge can be gained or lost.

Another way to state conservation of information is that the sum of the probabilities of all possibilities must add up to exactly 1, never more, never less.

It appears to me that most confusion on this topic comes from dictionary definitions that say information is a synonym for knowledge. In this context, they are not the same thing.
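The unitarity point above can be sketched numerically. This is a minimal toy example, not from the thread: the 2-state system, the Hadamard-like matrix, and the variable names are illustrative choices.

```python
import numpy as np

# A toy 2-state quantum system: unitary evolution keeps the number of
# states fixed and the total probability exactly 1.
U = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)   # a Hadamard-like unitary matrix

psi = np.array([0.6, 0.8])             # normalized state: 0.36 + 0.64 = 1
psi_evolved = U @ psi

print(np.sum(np.abs(psi)**2))          # ≈ 1.0 (before evolution)
print(np.sum(np.abs(psi_evolved)**2))  # ≈ 1.0 (after: probability conserved)

# Unitary evolution is invertible, so the earlier state is recoverable;
# this is the "distinct states stay distinct" form of information conservation.
psi_recovered = U.conj().T @ psi_evolved
print(np.allclose(psi_recovered, psi))
```

Applying the conjugate transpose undoes the evolution exactly, which is the precise sense in which unitary dynamics lets you deduce the previous state from the current one.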
 
  • #3
anorlunda said:
Knowing which state it is actually in, should be called knowledge, not information. Knowledge can be gained or lost.
So, is there more to a definition of knowledge WRT these systems?

Can one put a number on knowledge, in some way, as with information and entropy?
 
  • #4
BillTre said:
So, is there more to a definition of knowledge WRT these systems?

Can one put a number on knowledge, in some way, as with information and entropy?

My professor, Leonard Susskind, likes to use a six-sided die (one of a pair of dice) as an example.

The die can have 6 states. The (conserved) quantity of information is proportional to ##\ln(6)##.

We can have maximum knowledge, such as: the number facing up is 2.
We can have zero knowledge, such as: I have no idea which side is up.
We can have partial knowledge, such as: I know it is 1, 2 or 3, but not 4, 5 or 6.
When knowledge is maximum, entropy is minimum.
When knowledge is minimum, entropy is maximum.
We can assign numbers to all of this, but other people might number them differently.
The only well defined number in this case that we should all agree on is 6, for 6 possible states.
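The knowledge levels above can be put in numbers. This is a minimal sketch, measuring entropy in bits (log base 2) rather than the natural log mentioned earlier, and assuming the remaining states are equally likely; the function name is an illustrative choice.

```python
import math

NUM_STATES = 6  # a six-sided die

def entropy_bits(possible):
    """Shannon entropy, in bits, when `possible` states remain equally likely."""
    return math.log2(possible)

# Maximum knowledge: the face is known exactly     -> entropy log2(1) = 0
# Partial knowledge: it is 1, 2 or 3               -> entropy log2(3)
# Zero knowledge:    any of the 6 faces            -> entropy log2(6), the maximum
for possible in (1, 3, 6):
    print(possible, "possible states -> entropy", entropy_bits(possible), "bits")
```

The conserved quantity is the fixed total of 6 states; the entropy only tracks how much of that you happen not to know, which is why it moves inversely to knowledge.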

So even with quantum systems, the key number is the number of possible states. One of the difficulties is that the quantum systems don't have to get very big before the number of possible states becomes very large. For example, I couldn't even guess the number of possible quantum states in a single modest Carbon-12 atom. Including all the electrons, nucleons, quarks and gluons, the number of quantum states would be large. Perhaps a PF member more expert than I could give some sample numbers.
 
  • #5
Going with the dice example (6 states):

Not knowing which state the system is in; minimum knowledge:
anorlunda said:
When knowledge is minimum, entropy is maximum.

Knowing it is in only 1 of the 6 possible states:
anorlunda said:
When knowledge is maximum, entropy is minimum.

Knowing it is one of 3 of the possible 6 states; half max knowledge:

What would be the units of knowledge?

I was under the impression that entropy went as an inverse log function of information content.
This seems to be saying that (in this case) it is more like an inverse function of the information content (meaning the number of possible states) minus what is known of the system (the knowledge of which states it could actually be in).
So it is not strictly tied to information?
 
  • #6
There are 18 definitions of entropy in different contexts. So let's not get into the formulas for measuring it.

But in the case of the die, yes. You can say that entropy measures the unknown. It does have an inverse relationship to knowledge.

Putting units on knowledge is difficult and context dependent. I think it is better to stick with the concepts and to forget putting numbers and units on it. The closest thing to a universal unit is the bit.
 
  • #7
anorlunda said:
If we have a quantum system with n states, then unitarity requires that time evolutions keep the number of states as n; never more than n or less than n. In that sense, it is the quantity of information that is conserved.
I know about unitarity. But I'm talking here about the collapse of the wavefunction after a measurement, where we lose unitarity and time-reversal symmetry.
Different systems can collapse into the same state, and one cannot recover the initial state from the final state (there is no way to go backward in time).
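The non-invertibility being described here can be sketched as follows, assuming an ideal projective measurement; the state vectors and the helper function are illustrative, not from the thread.

```python
import numpy as np

z_plus = np.array([1.0, 0.0])  # |z+> in the z basis

def collapse_to_z_plus(psi):
    """Projective spin measurement along z, conditioned on the z+ outcome.
    Many different input states map to the very same output state."""
    amplitude = np.vdot(z_plus, psi)   # overlap <z+|psi>
    # After observing z+, the post-measurement state is |z+> regardless of
    # what psi was; only the probability of that outcome differed.
    return z_plus, abs(amplitude)**2

# Two distinct initial states...
psi_x = np.array([1.0, 1.0]) / np.sqrt(2)   # spin along x+
psi_y = np.array([1.0, 1.0j]) / np.sqrt(2)  # spin along y+

state_a, p_a = collapse_to_z_plus(psi_x)
state_b, p_b = collapse_to_z_plus(psi_y)

# ...end in the same final state, so the map cannot be inverted:
print(np.allclose(state_a, state_b))  # True
print(p_a, p_b)                       # each outcome had probability 0.5
```

Because two different inputs produce identical outputs, no function of the final state alone can recover the initial one, which is exactly why collapse (taken literally) is not unitary.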
 
  • #8
We never lose unitarity. If we did, information would not be conserved.

Remember also that wave functions collapse only in some of the QM interpretations. I choose to ignore all interpretations until the day that one of them is proved correct. Therefore, I say: what collapse?
 

What is information conservation?

Information conservation is the principle that states that the total amount of information in a closed system remains constant over time. This means that information cannot be created or destroyed, but it can be transformed or transferred between different forms.

Why is information conservation important?

Information conservation is important because it helps us understand how information is created, stored, and transferred within systems. It also plays a crucial role in fields such as physics, computer science, and information theory.

How is information conserved in physics?

In physics, information conservation is closely linked to the reversibility of the underlying dynamics: unitary time evolution in quantum mechanics (and Liouville's theorem in classical mechanics) implies that distinct states remain distinct, so information is neither created nor destroyed. The second law of thermodynamics, which says that the total entropy of a closed system never decreases, constrains how accessible that information is in practice.

What is the relationship between information conservation and entropy?

Entropy is a measure of the disorder or randomness in a system. As information is transformed or transferred, the entropy of a system may increase or decrease, but the total amount of information remains constant. This means that information conservation and entropy are closely related concepts.

How does information conservation apply to data storage and transfer?

In data storage and transfer, information conservation is important to ensure that data is not lost or corrupted. This is why methods such as error-correcting codes and redundant data storage are used to protect against information loss. Information conservation also plays a role in data compression and encryption, where the original information is transformed into a different form while still maintaining its total amount.
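A triple-repetition code is one of the simplest instances of the redundant storage mentioned above. This is a minimal sketch for illustration, not a production error-correcting code; the function names are hypothetical.

```python
def encode(bits):
    """Triple-repetition code: protect each bit by storing three copies."""
    return [b for bit in bits for b in (bit, bit, bit)]

def decode(noisy):
    """Majority vote over each group of three recovers the original bit
    even if one copy per group was corrupted."""
    return [int(sum(noisy[i:i + 3]) >= 2) for i in range(0, len(noisy), 3)]

data = [1, 0, 1, 1]
stored = encode(data)
stored[4] = 1 - stored[4]        # corrupt one stored copy
print(decode(stored) == data)    # True: the information survives the error
```

Redundancy trades storage for robustness: the quantity of information is unchanged, but it is spread out so that no single corrupted copy can destroy it.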
