# Boltzmann brains

I know this has come up before, but there is still something puzzling me about the whole BB paradox. This is the problem phrased in the context of our evolving universe. As far as we can tell, the universe will expand forever (assuming dark energy is a constant rather than something variable). In a system that lasts forever, it's said that BBs can spontaneously fluctuate out of the vacuum, and that this rules out our universe being such a fluctuation, because it's easier to get a brain than a whole universe from a fluctuation. Since we don't think we are BBs, this can't be how our universe formed.
But if the universe is infinite into the future, then both BBs and universes will fluctuate an infinite number of times, so where is the justification that one will be more frequent than the other? They will both be infinite.

Simon Bridge
Homework Helper
The argument is about proportions, not numbers. If something is 10x more likely than something else, then there will be 10x more of it than the other thing on large scales ... like across an infinite expanse. If there are 99x more rocks than trees, then even if there are an infinite number of rocks and an infinite number of trees, the chance of picking something at random and finding it is a tree is still 1%.
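The rocks-and-trees point can be made concrete with a toy simulation (my own illustration, not from the thread; the 99:1 ratio and sample size are arbitrary): even though both populations are unbounded, any finite random sample respects the fixed ratio.

```python
import random

random.seed(0)

def pick():
    # Each draw is a rock with probability 0.99 and a tree with
    # probability 0.01, mimicking a 99:1 rock-to-tree ratio across
    # an arbitrarily large (even infinite) expanse.
    return "tree" if random.random() < 0.01 else "rock"

samples = [pick() for _ in range(1_000_000)]
tree_fraction = samples.count("tree") / len(samples)
print(f"fraction of trees: {tree_fraction:.4f}")  # close to 0.01
```

No matter how many draws you make, the tree fraction hovers near 1%; the absolute (potentially infinite) counts never enter into it.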

The argument does not say that no universes could have emerged from the surrounding metaverse, but that we are massively unlikely, collectively, to be one: given 6-7 billion known intelligences, some (almost all of them) should be Boltzmann brains, but none are. In fact we have good reason to believe that we are not produced by a random fluctuation, and the odds against all that evidence having just popped out of nowhere a few seconds ago are... well... it's not strictly impossible, and we'd have no way to tell if it had. But... Occam's razor.

Another look ... and a statement about what it is supposed to disprove.
http://blogs.discovermagazine.com/c...ard-feynman-on-boltzmann-brains/#.V0hnKrp97VM

Thanks Simon, that helps a lot, but I still have some confusion. The BB problem is often phrased in terms of a thermal fluctuation from equilibrium, but it's also phrased in the context of eternal inflation. In this paper we see the following phrase: "In an eternally inflating multiverse, the numbers of normal observers and Boltzmann brains produced over the course of eternal inflation are both infinite. They can be meaningfully compared only after one adopts some prescription to regulate the infinities."
http://arxiv.org/pdf/0808.3778v3.pdf
That seems to be in conflict with what I hear from others like Sean Carroll, that the BBs must outnumber the normal observers in a statistical fluctuation. Although I note that elsewhere he has said the idea that BBs outnumber the normals is a model-dependent statement. They seem to be saying you can only make that statement after you introduce some measure, but normally we hear that it's inevitable, and no mention of introducing a measure is given. So it seems that the two statements are inconsistent. Where am I going wrong?

Simon Bridge
Homework Helper
Yeah - you are intrinsically making an assumption about the relative likelihood of BBs vs non-BB universe brains. It may be that the simple initial conditions for a universe that can give rise to brains, by physical laws leading to life and evolution by natural selection, are much more likely than a BB coming into being by random fluctuation. In fact: make the fluctuation quantum mechanical and AFAIK that's the dominant theory.

Having done that, made some assumption about the relative likelihoods, it is the probabilities that are important, giving rise to ratios of populations; so the absolute numbers of individuals, infinite or otherwise, are irrelevant to the argument. All the authors are noting is that you cannot divide infinity by infinity. That's fine; mathematicians have been dealing with that for ages.
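The "regulate the infinities" idea can be sketched with a toy cutoff (my own illustration, not the measure used in the paper; the production rates `r_normal` and `r_bb` are made-up numbers): count both populations only up to some cutoff T, then let T grow. Both counts diverge, but their ratio converges.

```python
# Toy cutoff regularization: normal observers appear at rate r_normal
# and Boltzmann brains at rate r_bb per unit time (invented values).
r_normal, r_bb = 3.0, 1.0

for T in (10, 1_000, 100_000):
    n_normal = r_normal * T  # diverges as the cutoff T -> infinity
    n_bb = r_bb * T          # also diverges as T -> infinity
    print(T, n_normal / n_bb)  # the ratio is the same at every cutoff
```

The catch, which the paper is pointing at, is that in eternal inflation the answer can depend on *which* cutoff prescription (measure) you choose, which is why the ratio is a model-dependent statement rather than an automatic one.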

Chalnoth
In fact we have good reason to believe that we are not produced by a random fluctuation, and the odds against all that evidence having just popped out of nowhere a few seconds ago are... well... it's not strictly impossible, and we'd have no way to tell if it had. But... Occam's razor.
To be pedantic, it's not random fluctuations in general that are the problem here, but specific kinds of them. It's possible to come up with models that form a universe like our own via random fluctuations that don't have the Boltzmann Brain problem.
