craigi said:
I'm going to try to be brief here:
1) Your text doesn't match your maths.
It's regarding this:
.Scott said:
My question was about how the different probabilities on an event affect the amount of information (or entropy) that is added to a world. With little thought, the answer is pretty clear. The number of new bits added to the world will be log base 2 of the inverse of the probability of the event. So, for example, if it's a 1 in 32 chance, the amount of added information will be 5 bits.
So here's the equation with the exception:
I: the information added to the world
p: the probability of the event that created the world
I = -\frac{\ln p}{\ln 2} = \log_2 \frac{1}{p} ... but only if you're sure of the time.
From the "1 in 32" example, p = 1/32, 1/p = 32, log base 2 of 32 is ln(32)/ln(2) = 5.
When I wrote the equation, I converted the log base 2 to "ln" just to make it better match up with the entropy/information conversion terms.
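The conversion is easy to check numerically. A minimal sketch (the function name is mine, not from the discussion above):

```python
import math

def bits_added(p):
    """Information (in bits) added by an event of probability p:
    I = -ln(p) / ln(2) = log2(1/p)."""
    return -math.log(p) / math.log(2)

# The "1 in 32" example: log2(32) = 5 bits.
print(bits_added(1 / 32))
```

The ln-ratio form and math.log2 agree; the ratio is used here only to mirror the equation as written.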
craigi said:
2) How do you feel about your equation giving a non integer for the number of bits?
That was a misconception on my part. I retracted it at post #130.
craigi said:
3) Where did your logarithm come from? You seem to suggest it just came to you, rather than through any deductive reasoning. It looks like you've just borrowed it from something you picked up from elsewhere, without understanding it.
If you have "b" bits, you can describe 2^b states.
So if you have n states, you will need log base 2 of n bits to encode them - not worrying for the moment about whether that yields an integer number of bits.
Now if all of the states are of equal probability (1/n), that is as far as you can go. But what if there are three states with p = 50%, 25%, and 25%? Well, each 25% state is just like 1 of 4 choices and will require 2 bits, while the 50% state will require only 1. So what really matters is the probability, not the number of choices.
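The three-state example above can be worked through directly; this sketch computes the per-state bit counts and their probability-weighted average (which is just the Shannon entropy of the distribution):

```python
import math

probs = [0.5, 0.25, 0.25]

# Bits needed to encode each state individually: log2(1/p).
per_state = [math.log2(1 / p) for p in probs]

# Probability-weighted average: sum of p * log2(1/p) = 1.5 bits.
H = sum(p * math.log2(1 / p) for p in probs)

print(per_state, H)
```

The 50% state costs 1 bit, each 25% state costs 2, and on average a measurement of this system adds 1.5 bits.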
I could be more formal in the math, but it would be more work and I don't think it would make it more convincing.
craigi said:
5) Wherever you got it from, you've missed the normalisation term.
I don't understand.
craigi said:
6) The choice of base for the logarithm is arbitrary.
The Bekenstein bound is expressed in bits (base 2). An arbitrary choice, but I like it.
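For reference, the bound in its standard form (stated here for context, not derived in this thread) is S \le \frac{2\pi k R E}{\hbar c}, where R is the radius of a sphere enclosing the system and E its total energy. Dividing by k \ln 2 converts entropy to bits: I \le \frac{2\pi R E}{\hbar c \ln 2}.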
craigi said:
7) Why do you feel the need to invoke an observer measurement of time to prevent your equation becoming invalid?
That's really the crux of the issue - and the part that I think is really cool for MWI.
I don't feel it; for a universe with a Bekenstein bound, I know it.
It's really a two-part argument. But let me give you part one for now.
We can use the universe as a whole as our baseline. So when all the worlds are taken as a whole, we are at the "base information" or "no information" state. We will also presume a time-zero (T0) for the universe, a time before any decoherence event has happened. At that moment, all possible worlds are possible, so at that moment we have "no information". That moment doesn't really need to exist, but it makes for a simpler picture. In fact, part 2 of the argument involves getting rid of T0.
So from T0, we allow time to run forward and entropy to steadily increase. As we do this, we get a more and more diverse collection of worlds. From the MWI point of view, because we are increasing the diversity of the worlds, the entropy is increasing. But as we approach maximum entropy, we start to run out of diversity.

It's kind of like the lottery. If the jackpot is only $1M and you win, you will probably discover that your ticket is the only one with the winning number. But if the jackpot reaches $1B and you win, you will probably have to share it with other ticket holders who chose the same number. So when several worlds pick the same state as one of their "tickets", you have a new world with no unambiguous history. You also get less of an increase in entropy. In fact, if one of the contributors to your new shared world is part of your own history, you will have, in effect, skipped back through time - losing entropy.
Now those are the characteristics of "heat death". But what does it look like from within a world experiencing it? How would you fight it? Well, you would try to keep good time - so if you suddenly skipped back or forth through time, you'd have a clock that told you what happened. But clocks do not last forever. They eventually blend in with the heat death. All time-keeping mechanisms, every one of them, will eventually be gone. Then we will drift into worlds where decoherence has no meaningful consequence for entropy. All of the world states will have been generated, and time will simply move each world into one of several other already-existing worlds.
craigi said:
8) You still seem to think this is relevant to the MWI, but this is just entropy generated at decoherence.
I'm not saying that the tail-off in entropy generation can't be explained with other interpretations. Only that I find MWI interesting because it makes it easy to see what happens as heat death approaches.
craigi said:
9) When you talk about Bekenstein bounds you ignore other sources of entropy.
I'm not sure I understand. In all cases, I am presuming that we are constrained to a Bekenstein Bound - although small leaks that cross the bound would not change the overall picture.
craigi said:
10) Entropy has an actual definition in physics, involving the Boltzmann constant.
Yup. And that is the one that I am talking about.
I am really saying that there can be world conditions where decoherence will not result in a net increase in entropy.