Many-worlds: When does the universe split?

The discussion centers on the many-worlds interpretation (MWI) of quantum mechanics, specifically questioning when universe splitting occurs and the role of superposition. Two main alternatives are presented: one suggests superposition exists in a single universe until measurement causes a split, while the other posits that multiple universes exist from the start, revealing themselves upon measurement. Participants express skepticism about both alternatives, finding them implausible and questioning the explanatory power of MWI regarding observed probabilities. The conversation also highlights that decoherence is a measurable feature of quantum mechanics, not exclusive to MWI, and emphasizes the deterministic nature of the Schrödinger equation in the evolution of the wave function. Ultimately, the dialogue reflects a broader uncertainty about the implications and testability of MWI in explaining quantum phenomena.
  • #91
kith said:
You still don't address the issue that this classical information is erased by the next measurement.
The information I am talking about is not specific to the particle. I am not saying that the particle itself holds more information. I am saying that once a measurement is made of any truly "random" event (which would include the selection of an MWI world), there is more total information in the world.

And from everything that I can understand, this is exactly the sort of "physical information" that is supposed to be indestructible. I do need to continue looking at exactly what is meant by quantum information destruction. It's basically two different quantum states evolving into the same state - presumably not possible. But I am still looking and studying.
 
  • #92
kith said:
You still don't address the issue that by obtaining this classical information you are erasing the information you previously had - namely the spin in another direction.
You go from 4 possibilities to 1. From Spin A+, A-, B+, B- to only one of them, with all of the other possibilities eliminated. What is important is not that you are losing Spin A information when you measure along the B axis, but that the result of the Spin B measurement itself will be random. Of course, if Spin B has already been measured on an entangled particle, then it won't be random.
The real question is whether it is ever truly random.
 
  • #93
Bill is right: it is overdue that you give a definition of "information".
 
  • #94
.Scott said:
You go from 4 possibilities to 1. From Spin A+, A-, B+, B- to only one of them.
What are A and B?
 
  • #95
.Scott said:
I know that "virtual particles" most often refer to those flighty things that pop up in a vacuum, but I believe they can also refer to the different paths a photon can follow before it is finally detected. In any case, "virtual" or not, these "construction" photons are commonly used in the Physics literature. If you wish, I will call them "construction" photons or anything else. But I believe "virtual" is a correct term for them.

Nobody with a clue would call different interfering paths "particles", and specifically not "virtual particles". They're also not "construction photons" or anything like that. They're just different interfering histories, if you want to enforce such a view. But that suggests there is some classical reality underneath, so I consider it best to avoid that picture entirely outside of perturbation calculations. What happens is simply a unitary evolution of the quantum state, nothing else.

Cheers,

Jazz
 
  • #96
.Scott said:
There are different answers for different basic theories. But let me give an example that follows the most vanilla, well-accepted scenario. This specific scenario will hold through to the end of this post:

You are about to measure the spin of a particle along a specific axis. QM predicts that the result of the experiment will be 50% chance of up, 50% chance of down - and there is nothing in the universe that will tell you which will happen - either practically or even in principle.

So that is the situation in my W0. At this point you have a 2-page document with no information about which page you will end up on.

Then you make the measurement; it is either up or down. You are still in the same universe with 2 pages, but now you know which page you are on. This was information that was non-existent in W0. So the information is simply which choice you end up living in. So you ask "where do you see the increase in information". My answer is that it can be very apparent at the macroscopic scale: it's the result of a measurement that could not be predicted before being made.

To the very limited extent that I understand the math, it's easy to see it there as well. If you have an equation that yields the set of all integers, that is less information than that same equation restricted to any particular choice. A quadratic equation that yields -4 and 6 as results is less information than the same equation restricted to either x>0 or x<0. As time moves on, our universe becomes more and more specific.

But shouldn't that show up as a world that is somehow growing? What does a particle with extra information look like? I don't know.

Are you aware of the Second Law of Thermodynamics?

Decoherence involves thermodynamically irreversible processes. This involves a loss of order and an increase in entropy as interference spreads to the surrounding environment and as such would require an increased amount of information to represent.

This entropy growth is nothing new. The understanding of it predates quantum mechanics by about a century.

The entropy growth involved in QM is a direct result of the formulation and is not interpretation dependent. It is the observer dependent reality that contains all relevant information. If you wish to consider the entropy in the MWI multiverse then I'd argue that it is zero and contains no information.

Regarding the enumeration of other possible universes as a measure of information content: the MWI doesn't differ in that respect. Those other possibilities are still there; the difference is that the MWI says they are inaccessible rather than just hypothetical.

When you refer to the "conservation of information", I think you're referring to the quantum no-deleting theorem. This is specific to quantum information stored in qubits, for instance. The classical world doesn't conserve information; otherwise it would break the second law of thermodynamics. A qubit can yield no more than one classical bit of information.

When entropy increases, a particle doesn't store extra information. The classic example is if you had n particles of type A in box C and n particles of type B in box D, then put them all in box E. The entropy has increased. The computing analog of this is n 0's followed by n 1's contains much less information than if they're all randomly ordered. The information stored can be measured by the theoretical minimum for the compressed size of those bits. The maths of information entropy is almost identical to that of thermodynamic entropy. They are effectively the same thing.
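The compression analogy in the last paragraph can be sketched numerically. This is my own rough illustration, not anything from the thread; zlib's compressed size is only a crude stand-in for the theoretical minimum encoding size:

```python
import random
import zlib

n = 10_000

# n '0's followed by n '1's: highly ordered, low information content.
ordered = b"0" * n + b"1" * n

# The same symbols in a random order: high information content.
rng = random.Random(42)
shuffled = bytes(rng.choice(b"01") for _ in range(2 * n))

# Compressed size is a (crude) upper bound on the information stored.
print(len(zlib.compress(ordered)))   # tiny: the order compresses away
print(len(zlib.compress(shuffled)))  # much larger: randomness resists compression
```

The ordered string collapses to a few dozen bytes, while the randomly ordered one stays near one bit per symbol, which is the point of the analogy.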
 
  • #97
.Scott said:
The number of points on a line segment and the amount of information in the universe are completely different topics. The cardinality of the points on a line or line segment is that of the continuum.

Sorry. But it is the SAME topic.

When a wavefunction is partitioned by decoherence, because we are dealing with complex numbers each part has exactly the same cardinality as the real line. Each world contains an infinite amount of information, just like when you split a real interval into two - each subinterval contains exactly the same information as the original interval. It's one of the screwy things about infinity.

Your book analogy is incorrect. Because we are dealing with a complex vector space, your book effectively contains infinitely many pages. Divide such a book any way you like, say in half, and each half contains an infinite number of pages. It is this attribute that allows the partitioning of the universal wavefunction to continue indefinitely without altering the information in each subworld, which remains infinite: infinity/n = infinity.

Thanks
Bill
 
  • #98
Maui said:
For all practical purposes the length of the line determines how much information can be encoded upon it (there is always a practical limit). That is unless one considers the universe to be a mathematical object created by some mathematician, but that'd be philosophy.

That is WAY wrong. A real or complex valued function encodes an infinite amount of information. That is why the partitioning of it can continue indefinitely, and each sub-partition contains an infinite amount of information.

QM is a MATHEMATICAL MODEL and as such is exactly that - it contains mathematical objects created by mathematicians with wacky, counterintuitive properties.

Thanks
Bill
 
  • #99
.Scott said:
The information I am talking about is not specific to the particle. I am not saying that the particle itself holds more information. I am saying that once a measurement is made of any truly "random" event (which would include the selection of an MWI world), there is more total information in the world.

What we are saying is that this is WRONG. The information encoded in the complex valued wavefunction (or more correctly, state) of the universe is infinite. When that wavefunction is 'partitioned' by decoherence, each world also contains infinite information. This can continue indefinitely, without bound, with no information being gained or lost.

Thanks
Bill
 
  • #100
craigi said:
They are effectively the same thing.

Which of course was one of Shannon's great discoveries. It's basically a measure of disorder. Chaos is coming - the universe is tending to disorder.

But I suspect Scott is not looking at it that way. For some reason he is thinking that with each observation information increases. He fails to understand that in the MWI the information in a state is a useless concept, because it's not information that 'splits' but the state.

Thanks
Bill
 
  • #101
bhobba said:
Which of course was one of Shannon's great discoveries. It's basically a measure of disorder. Chaos is coming - the universe is tending to disorder.

But I suspect Scott is not looking at it that way. For some reason he is thinking that with each observation information increases. He fails to understand that in the MWI the information in a state is a useless concept, because it's not information that 'splits' but the state.

Thanks
Bill

It does, doesn't it?

A measurement causes decoherence, which causes the interference to get dissipated into the environment, which causes an increase in entropy.

So in that respect he's right.
 
  • #102
craigi said:
It does, doesn't it?

:biggrin::biggrin::biggrin::biggrin::biggrin::biggrin::biggrin:

Of course - in the context you are considering - but that is not the context he is thinking of.

In the MWI the wave-function gets partitioned by decoherence, which can be viewed as a loss of the information contained in the phase of a superposition - it gets jumbled up with the environment, in a rough intuitive sort of way. But the information contained in the overall universal wave-function (not that it's a useful concept in this context) doesn't change one whit.

Kith hit the nail on the head - he needs to give his definition of information and why it increases.

Even classically our knowledge of say the position of a particle is constantly increasing so in that sense information is increasing - but it is of zero relevance.

What is relevant to entropy is that the configurations of particles that show no order overwhelmingly outnumber those that do - that's why entropy increases. It's not impossible for the particles of two gases to suddenly separate - it's that if you consider all configurations, the number where they are randomly intermingled is overwhelmingly greater than where they are separate.
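The counting argument here can be made concrete with a toy model; this is my own illustration of the point, not anything from the thread:

```python
from math import comb

# Toy model: 2n gas particles, n of type A and n of type B, in a box whose
# left and right halves each hold exactly n particles.
n = 50

# Total ways to choose which n of the 2n particles sit in the left half.
total = comb(2 * n, n)

# "Fully separated" arrangements: all A's left / all B's right, or vice versa.
separated = 2

print(total)              # astronomically large
print(separated / total)  # vanishingly small chance of spontaneous separation
```

Already at 100 particles the separated arrangements are outnumbered by a factor of order 10^28, which is why spontaneous unmixing is never seen in practice.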

Although I have only given the analogy a superficial consideration, I suspect it's exactly the same with decoherence - information is lost to the environment simply because that is by far the more likely scenario.

In the MWI there would be worlds where gases separated, but they would be vastly outnumbered by those showing what we usually experience.

Thanks
Bill
 
  • #103
bhobba said:
:biggrin::biggrin::biggrin::biggrin::biggrin::biggrin::biggrin:

Of course - in the context you are considering - but that is not the context he is thinking of.

Kith hit the nail on the head - he needs to give his definition of information and why it increases.

I think in a roundabout way he's actually right, if we use the definition of information as entropy. He's of course ignoring a formal definition, but splitting does cause an increase in information (entropy) in the observer's reality.

Where I think he's wrong is in thinking that it's somehow special to the MWI. To me, the entropy increase at decoherence events is just a product of the second law.
 
  • #104
Well well well...

I did a quick search on this and look what I found:

Andrew Thomas said:
http://www.ipod.org.uk/reality/reality_decoherence.asp

Regarding change of entropy, in the Many Worlds interpretation, entropy increases after each universe-branching operation (the resultant universes being slightly more disordered). So Many Worlds **causes** an increase in entropy. But in the explanation of decoherence I presented in the main article, the increase in entropy due to the Second Law **causes** decoherence. So that is far more preferable to the Many Worlds cause/effect sequence. And decoherence is explained by an existing physical principle: the second law of thermodynamics.

...

The more I think about the Many Worlds interpretation, the less sense it makes.

Andrew Thomas, 27th October 2008

It seems that he has struggled with this concept too. I don't understand where his confusion lies though. Again, I just don't see why the MWI is considered different to the CI, in this respect. Reading the rest of this page, he does seem to be a strong opponent of the MWI and a proponent of physical collapse.
 
  • #105
craigi said:
I did a quick search on this and look what I found:

I think it's pretty obvious entropy increases - if that is what was being said, no problem.

But what he is claiming is that information increases and must eventually exceed the Bekenstein bound, and that information is somehow increasing because of 'observations'.

First of all we need to see his definition of information in a quantum system and why it should increase without bound.

Thanks
Bill
 
  • #106
bhobba said:
That is WAY wrong. A real or complex valued function encodes an infinite amount of information. That is why the partitioning of it can continue indefinitely, and each sub-partition contains an infinite amount of information.

QM is a MATHEMATICAL MODEL and as such is exactly that - it contains mathematical objects created by mathematicians with wacky, counterintuitive properties.

Thanks
Bill



That is because you are so deeply involved in your mathematical model that you seem to forget that:


1. QM is just a model
2. it's not the real world out there (the real world is actually not described by QM as it is - a major flaw)
3. your considerations are practically wrong because of the Planck length limit. You cannot divide length further than the limit allows.

You are basically equating mathematics with physics, and in the case you were presenting, the experimental confirmation of your theory (that a line's length is infinitely divisible) fails at the Planck length, and that is all physics cares about.

You do understand that a real interval of any length contains exactly the same information as a real interval of any other length? Split any length in two and each part contains exactly the same information as before it was split?



If this quote was about quantum information and I misunderstood your idea (it's not obvious to me that it is so), I retract my statements about the practical limit on the amount of information that can be encoded in a given volume/area.
 
  • #107
An increase in entropy corresponds to a decrease of information. This is the opposite of .Scott's assertions.

This thread is not productive because he's focusing on the collapse part (which he calls "which world information") and ignoring decoherence, while most of the others talk only about decoherence.
 
  • #108
kith said:
What are A and B?
They're two orthogonal measurements. One disappears when you measure the other.
 
  • #109
.Scott said:
They're two orthogonal measurements. One disappears when you measure the other.
Okay, so initially we have a superposition in the A direction and don't know whether the spin is A+ or A- but we know it is B+. After the measurement in the A direction, we know it is -say- A- but not whether it is B+ or B-. Where's the net information increase?
 
  • #110
kith said:
Okay, so initially we have a superposition in the A direction and don't know whether the spin is A+ or A- but we know it is B+. After the measurement in the A direction, we know it is -say- A- but not whether it is B+ or B-. Where's the net information increase?

It's in the measuring apparatus. You can't make a measurement without it. It's not the measured particle that stores the information increase but the macroscopic system used to measure it. The superposition terms are very quickly and irreversibly dissipated through the extensive coupling, causing a small amount of disorder, which takes an increased amount of information to represent.
 
  • #111
bhobba said:
.Scott said:
The information I am talking about is not specific to the particle. I am not saying that the particle itself holds more information. I am saying that once a measurement is made of any truly "random" event (which would include the selection of an MWI world), there is more total information in the world.
What we are saying is that is WRONG. The information encoded in the complex valued wavefunction (or more correctly state) of the universe is infinite. When that wavefunction is 'partitioned' by decoherence each world also contains infinite information. This can continue indefinitely without bound with no information being gained or lost.

Thanks
Bill

There is a Bekenstein bound that limits the amount of information based on energy (E) and radius (R):
$$S \leq \frac{2 \pi k R E}{\hbar c}$$ (thanx Wikipedia)
All of those other symbols are constants except for the E and R.

Apparently there is a direct conversion from entropy (S) to bits of information (I):
$$I = \frac{S}{k \ln 2}$$

So: $$I \leq \frac{2 \pi R E}{\hbar c \ln 2}$$

What makes the universe different from a line is the Planck length.

kith said:
An increase in entropy corresponds to a decrease of information. This is the opposite of .Scotts assertions.
Please note the equation above:
$$I = \frac{S}{k \ln 2}$$
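As a sanity check on these formulas, here is a small numerical sketch of the bound expressed directly in bits, ##I \leq 2\pi R E/(\hbar c \ln 2)##. The constants are standard SI values, and the 1 kg / 1 m example is my own illustration, not a claim from the thread:

```python
import math

# SI constants
hbar = 1.054571817e-34  # reduced Planck constant, J*s
c = 2.99792458e8        # speed of light, m/s

def bekenstein_bits(radius_m, energy_J):
    """Bekenstein bound in bits: I <= 2*pi*R*E / (hbar * c * ln 2)."""
    return 2 * math.pi * radius_m * energy_J / (hbar * c * math.log(2))

# Illustrative example: a 1 kg mass (E = m c^2) inside a 1 m radius sphere.
E = 1.0 * c**2
print(f"{bekenstein_bits(1.0, E):.2e}")  # roughly 2.6e43 bits
```

Large but finite, which is the sense in which the universe differs from a mathematical line.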
 
  • #112
craigi said:
kith said:
Okay, so initially we have a superposition in the A direction and don't know whether the spin is A+ or A- but we know it is B+. After the measurement in the A direction, we know it is -say- A- but not whether it is B+ or B-. Where's the net information increase?
It's in the environment. You can't make a measurement without it. It's not the measured particle that stores the information increase but the macroscopic system used to measure it, together with its coupling to the external environment.
Yes!
 
  • #113
Maui said:
1. QM is just a model

And yet it makes predictions remarkably consistent with experiment.

Maui said:
2. it's not the real world out there(the real world is actually not described by qm as it is - major flaw)

The real world not described by QM? We have zero evidence for that.

Maui said:
3. your considerations are practically wrong because of the Planck length limit. You cannot divide length further than the limit allows.

And you know that's because QM is wrong exactly how? We know our current theories break down there, but whether that's because QM is wrong is far from certain. In fact the best current candidate for doing exactly that, String Theory, is a quantum theory.

Maui said:
You are basically equating mathematics with physics and in the case you were presenting, the experimental confirmation of your theory(that a line's length is infinitely divisible) fails at the Planck length and that is all physics cares about.

Our current theories break down at the Planck length. Whether that is because the continuity assumption fails is unknown.

BTW I am not saying that mathematics is physics. I am saying exactly what Feynman said:


Physics is not mathematics but is written in the language of mathematics. You can't pick and choose what aspects of that language to ignore or not.

Thanks
Bill
 
  • #114
craigi said:
Are you aware of the Second Law of Thermodynamics?
Decoherence involves thermodynamically irreversible processes. This involves a loss of order and an increase in entropy as interference spreads to the surrounding environment and as such would require an increased amount of information to represent.

This entropy growth is nothing new. The understanding of it predates quantum mechanics by about a century.

The entropy growth involved in QM is a direct result of the formulation and is not interpretation dependent. It is the observer dependent reality that contains all relevant information. If you wish to consider the entropy in the MWI multiverse then I'd argue that it is zero and contains no information.
I wasn't sure about all of this, but I have always kept open the option of the information continuously increasing. The only tie to the MWI is that the MWI clearly creates new information, whereas other views are consistent with no new information.

Correct me if I'm wrong, but we can calculate the increase in entropy for any decoherence. That being the case, we should be able to count the number of worlds created in such events:

S: increase in entropy
I: increase in number of bits
W: number of worlds after the event

$$I = \frac{S}{k \ln 2}$$

$$W = 2^{I}$$

Someone will have to tell me if this always works out to an integer. If it doesn't, I'm not bothered with fractional worlds if you're not.

Also, what will stop entropy from increasing? I know a black hole does. Put simply, time stops at the event horizon, putting a cap on information inflation. But if the universe is not destined to freeze onto the surface of a black hole, will anything stop entropy?

If not, we have this continuous increase in entropy, perhaps requiring a continuous increase in either the mass or radius of the universe - so as not to exceed the Bekenstein bound.

Of course, as was pointed out earlier in this thread, it's possible that worlds will begin to merge as they run out of mass and room to remain different.
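For what it's worth, the arithmetic of this world-counting scheme can be tried numerically, using the conversion ##I = \Delta S/(k \ln 2)##. This is only a sketch of the proposed bookkeeping, not an endorsement of counting worlds this way:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy_to_bits(delta_S):
    """Convert an entropy increase (J/K) to bits via I = delta_S / (k_B * ln 2)."""
    return delta_S / (k_B * math.log(2))

# An entropy increase of exactly k_B ln 2 (Landauer's one-bit value)
# comes out to 1 bit, i.e. W = 2**I = 2 "worlds" in this scheme.
delta_S = k_B * math.log(2)
I = entropy_to_bits(delta_S)
print(I, 2 ** I)
```

For a generic ##\Delta S##, I is not an integer, which is exactly the "fractional worlds" issue the post raises.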
 
  • #115
craigi said:
You can't make a measurement without it.

For theories that use decoherence it seems generally assumed a measurement is when decoherence occurs. For example, a few stray photons are enough to decohere a dust particle and give it an effective position. That is generally considered a 'measurement' even though it's not associated with an actual observational apparatus. In fact the VAST majority of 'measurements' are like this rather than the formal apparatus type.

I also thought that formally Von Neumann entropy was the same as Shannon entropy, and an increase in either represents an increase in randomness. Highly random 'messages' contain no information, so Kith is correct - information content decreases with increasing entropy. However, highly random 'messages' require the greatest amount of information to encode - so in that sense information increases. I am unsure which sense is meant here.

Thanks
Bill
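The connection between von Neumann and Shannon entropy mentioned above can be checked on a qubit. A minimal sketch of my own (the function name is mine): the von Neumann entropy equals the Shannon entropy of the density matrix's eigenvalues, so a pure state gives 0 and the maximally mixed qubit gives 1 bit.

```python
import numpy as np

def von_neumann_entropy_bits(rho):
    """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # convention: 0 * log 0 = 0
    h = -np.sum(evals * np.log2(evals))
    return float(max(h, 0.0))             # clamp a possible -0.0 to 0.0

pure = np.array([[1.0, 0.0], [0.0, 0.0]])  # pure state |0><0|
mixed = np.eye(2) / 2                      # maximally mixed qubit

print(von_neumann_entropy_bits(pure))   # 0.0
print(von_neumann_entropy_bits(mixed))  # 1.0
```

Decoherence drives a subsystem from the first case toward the second, which is the entropy increase being discussed.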
 
  • #116
bhobba said:
The real world not described by QM? We have zero evidence for that.
If you mean that QM presents a non-contextual, objective, local and realistic world (as is known to scientific models and experience) you really couldn't be more wrong.

bhobba said:
And you know that's because QM is wrong exactly how?

Where did I say that? I said that mathematics is not the world out there and QM does not represent the world as experienced (certainly not without the addition of fairy tales and personal beliefs).

bhobba said:
Our current theories break down at the Planck length. Whether that is because the continuity assumption fails is unknown.

The fact is you cannot divide physical lengths an infinite number of times.
 
  • #117
.Scott said:
I wasn't sure about all of this, but I have always kept open the option of the information continuously increasing. The only tie to the MWI is that the MWI clearly creates new information, whereas other views are consistent with no new information.

Correct me if I'm wrong, but we can calculate the increase in entropy for any decoherence. That being the case, we should be able to count the number of worlds created in such events:

S: increase in entropy
I: increase in number of bits
W: number of worlds after the event

$$I = \frac{S}{k \ln 2}$$

$$W = 2^{I}$$

Someone will have to tell me if this always works out to an integer. If it doesn't, I'm not bothered with fractional worlds if you're not.

Also, what will stop entropy from increasing? I know a black hole does. Put simply, time stops at the event horizon, putting a cap on information inflation. But if the universe is not destined to freeze onto the surface of a black hole, will anything stop entropy?

If not, we have this continuous increase in entropy, perhaps requiring a continuous increase in either the mass or radius of the universe - so as not to exceed the Bekenstein bound.

Of course, as was pointed out earlier in this thread, it's possible that worlds will begin to merge as they run out of mass and room to remain different.


Nope. Decoherence isn't special to the MWI. It's part of the QM formulation. Entropy increases at 'measurement' events regardless of interpretation and by the same amount.

The other remaining issue, of the Bekenstein bound prohibiting the entropy growth, is explained by understanding that the entropy is dissipated spatially. If we have a localised particle that is subject to constant alternating measurements, it doesn't increase entropy at its location. The entropy is contained within the measuring device. Your next thought might be that even the measuring device can't store the entropy if the measurements continue indefinitely. You'd be right, but the measuring apparatus is coupled to its external environment and dissipates entropy to the rest of the universe.

Ultimately, one possibility is the heat death of the universe, where the universe would become so disordered that your measuring apparatus would no longer be able to function, statistically speaking.
 
  • #118
Maui said:
If you mean that QM presents a non-contextual, objective, local and realistic world (as is known to scientific models and experience) you really couldn't be more wrong.

Why you think I meant that has me beat. It's dead simple: we do not have a TOE - whether that theory is based on QM or not is unknown.

Maui said:
Where did I say that? I said that mathematics is not the world out there and QM does not represent the world as experienced (certainly not without the addition of fairy tales and personal beliefs).

Gee mate - so your issue is we only describe the world, and that is not the same as the world itself. Wow - what an earth-shattering observation.

Maui said:
The fact is you cannot divide physical lengths an infinite number of times.

You might find it illuminating to actually describe, define, or whatever word you want to use, what a length is without invoking the concepts of geometry and the real numbers.

Thanks
Bill
 
  • #119
craigi said:
Ultimately, one possibility is the heat death of the universe where the universe would become so disordered that your measuring apparatus would no longer be able to function, statistically speaking.
Is there some feature of heat death that would stop occurrences of decoherence?
 
  • #120
bhobba said:
For theories that use decoherence it seems generally assumed a measurement is when decoherence occurs. For example, a few stray photons are enough to decohere a dust particle and give it an effective position. That is generally considered a 'measurement' even though it's not associated with an actual observational apparatus. In fact the VAST majority of 'measurements' are like this rather than the formal apparatus type.

I also thought that formally Von Neumann entropy was the same as Shannon entropy and an increase in either represents an increase in randomness. Highly random 'messages' contain no information so that Kith is correct - information content decreases with increasing entropy. However highly random 'messages' require the greatest amount of information to encode - so in that sense information increases. I am unsure what sense is meant here.

Thanks
Bill

Sure, feel free to use 'measurement apparatus' and 'macroscopic system' interchangeably in what I was saying. The former seems easier to visualise; the latter, a generalisation where it holds equally well.

Regarding randomness and information: in the terminology that I'm using, a random series of bits holds maximum information, whereas a series of 0's holds minimal information. That makes it synonymous with entropy. I don't doubt that the literature sometimes uses the term information in other contexts too. I can understand why we might say that randomness contains minimal (useful) information, but I can't see how we could argue that a series of all 0's could constitute maximum information.
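The two cases in the paragraph above can be made concrete with per-symbol Shannon entropy. A small sketch of my own; note this only measures symbol frequencies, so it is a first-order proxy for randomness, not a full measure of it:

```python
import math
from collections import Counter

def entropy_per_symbol(s):
    """Empirical Shannon entropy, H = -sum p log2 p, in bits per symbol."""
    counts = Counter(s)
    n = len(s)
    h = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return max(h, 0.0)  # normalise -0.0 for single-symbol strings

print(entropy_per_symbol("0" * 1000))    # 0.0: all zeros carry minimal information
print(entropy_per_symbol("0110100110"))  # 1.0: balanced 0's and 1's, the maximum
```

A string of all 0's sits at the bottom of this scale and a balanced random string at the top, which is the sense of "information" being used here.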
 
