Universe's Informational Content and the Big Bang

  • Context: Graduate
  • Thread starter: ricardo81
  • Tags: Big bang

Discussion Overview

The discussion revolves around the informational content of the universe, specifically the claims made by Seth Lloyd regarding the maximum number of bits of information (10^90 to 10^120) that can be contained within the universe. Participants explore the implications of this idea, questioning how such a binary representation might relate to current physical theories, the quantization of space-time, and the complexity of describing elementary particles.

Discussion Character

  • Exploratory
  • Technical explanation
  • Conceptual clarification
  • Debate/contested

Main Points Raised

  • Some participants suggest that translating a bit string into a physical description of the universe is contingent on how one interprets that string, raising questions about the quantization of space-time.
  • There is a query about whether a significant percentage of the 10^90-bit string variations can be ruled out based on known physical laws and the nature of elementary particles.
  • One participant proposes that if the bits required to describe elementary particles could be quantified, it might reduce the complexity of the possible configurations of the universe.
  • Concerns are raised regarding the finite nature of the universe and whether the assumptions made by Lloyd's model are valid, given that the universe may not be finite.
  • References to the Holographic Principle are made, suggesting a connection between entropy and the informational content of the universe.
  • Participants discuss the role of technology and personal preferences in understanding physics, particularly in relation to the use of computers versus traditional methods of reasoning.

Areas of Agreement / Disagreement

Participants express a range of views, with some agreeing on the abstract nature of bits and their interpretation, while others contest the implications of a finite informational capacity of the universe. The discussion remains unresolved regarding the validity of Lloyd's claims and the quantization of physical phenomena.

Contextual Notes

Participants note limitations in the assumptions made about the universe's finiteness and the definitions of bits in relation to physical quantities. There is also uncertainty regarding the extent to which current physical laws can inform the translation of bits into a physical model.

ricardo81
It's been suggested that the Universe has no more than 10^90 (potentially up to 10^120) bits of information (Seth Lloyd, Computational Capacity of the Universe, http://arxiv.org/pdf/quant-ph/0110141v1.pdf )

As a bit string, that's an arrangement of 10^90 0's and 1's. In the same document he mentions that there is a maximum of 10^120 bit-switching operations that could have occurred in the timespan from the Big Bang to the present.
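Lloyd's ~10^120 figure can be sanity-checked with a back-of-envelope estimate based on the Margolus-Levitin bound (ops ≤ 2Et/πħ), which is the kind of argument his paper uses. The mass and age inputs below are rough, assumed order-of-magnitude values, not numbers taken from the paper:

```python
import math

# Rough order-of-magnitude inputs (assumed for illustration):
HBAR = 1.05e-34   # reduced Planck constant, J*s
C = 3.0e8         # speed of light, m/s
MASS = 1e53       # approximate mass content of the observable universe, kg
AGE = 4.35e17     # age of the universe, s (~13.8 billion years)

energy = MASS * C**2                       # total energy, E = mc^2
ops = 2 * energy * AGE / (math.pi * HBAR)  # Margolus-Levitin bound on total operations

print(f"~10^{math.log10(ops):.0f} elementary operations")  # ~10^121, near Lloyd's 10^120
```

With these crude inputs the bound lands within an order of magnitude of the 10^120 quoted above, which is about as close as a back-of-envelope estimate can be expected to get.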

I have some queries regarding this and hopefully your better educated minds can enlighten me! I believe my questions emphasise the informational content more than the operations though I may be wrong in assuming that.

1. Is there a meaningful way, given the current understanding of physics, to translate this base-2 description into physical terms?

2. If so, is there a rough percentage of these 10^90-bit string variations that can be ruled out? i.e. it'd be unlikely (perhaps) that a string would be more than 90% 0's, or 90% 1's.

3. It seems like if you could calculate the number of bits required to describe elementary particles then that would reduce complexity by many orders of magnitude. This is along the lines of "Hashlife" (http://en.wikipedia.org/wiki/Hashlife) being able to describe Game of Life evolutions more efficiently. Does quantum mechanics ruin this assumption?

4. I probably could think of more related questions, but generally I'd welcome any comment.
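Question 2 can be explored at toy scale with a pure counting argument: the fraction of n-bit strings that are more than 90% zeros shrinks exponentially in n. A minimal sketch (the n values are arbitrary):

```python
from math import comb, log2

def log2_fraction_mostly_zeros(n, threshold=0.9):
    """log2 of the fraction of n-bit strings with more than threshold*n zeros."""
    count = sum(comb(n, k) for k in range(int(threshold * n) + 1, n + 1))
    return log2(count) - n  # fraction = count / 2^n

for n in (100, 1000):
    print(n, log2_fraction_mostly_zeros(n))  # ~-59 for n=100, ~-539 for n=1000
```

Extrapolating, at n = 10^90 the exponent is of order -10^89, so "mostly-zero" strings are a vanishingly small fraction by counting alone; which strings are actually *physically* realizable is a separate question that counting cannot answer.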

Thanks
 
A bit is an abstract concept, and translating a string of bits to a description of the universe depends entirely upon how you interpret that string of bits.

What this says is that if you had a computer that could store the exact state of the entire universe*, it would need at least 10^90 bits of memory. This means, for example, that only integers could be stored in that computer: arbitrary real numbers require infinite storage. So this requires that space-time be quantized in some sense, so that positions and velocities can be described with integers alone.
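As a toy illustration of that quantization point, suppose (purely as an illustrative assumption, not established physics) that positions come in integer multiples of the Planck length. Then pinning down one coordinate inside the observable universe takes only a couple of hundred bits:

```python
import math

PLANCK_LENGTH = 1.6e-35     # metres (approximate)
UNIVERSE_DIAMETER = 8.8e26  # metres, observable universe (approximate)

# Number of distinguishable positions along one axis, if space were a
# Planck-scale lattice (an illustrative assumption only).
positions = UNIVERSE_DIAMETER / PLANCK_LENGTH
bits_per_coordinate = math.ceil(math.log2(positions))
print(bits_per_coordinate)  # ~206 bits per coordinate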

I don't understand your question 3. Reduce the complexity of what?

* Note: such a computer would have to be outside our universe, or else it'd have to contain a description of itself, which is impossible.
 
ricardo81 said:
It's been suggested that the Universe has no more than 10^90 (potentially up to 10^120) bits of information (Seth Lloyd, Computational Capacity of the Universe, http://arxiv.org/pdf/quant-ph/0110141v1.pdf )
One objection to this is that since it posits a finite answer, it of necessity uses a model in which the universe is finite, but that is NOT known to be true. It MAY be true, with the universe finite but unbounded, or it may NOT be true, in which case his analysis is pointless.
 
Thank you kindly for your replies

> posits a finite answer

Indeed. From my somewhat limited understanding of current physical theory, it's still in the realm of possibility, Loop Quantum Gravity being one of the descriptions that would explain away the seemingly continuous nature of space-time.

> A bit is an abstract concept, and translating a string of bits to a description of the universe depends entirely upon how you interpret that string of bits.

Yes! I guess part of my question is whether matter/energy/fields have been quantified in such a way. Maybe base 2 is the flavour of the day because we're in the computer age, but it's also the fundamental base of true and false values. If all is discrete and countable, then describing it in base 2 would give you the precise amount of information you need and no more (no redundancy required :) )
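That "precise amount and no more" is just the standard counting result: a system with N distinguishable states needs ceil(log2(N)) bits to label every state, with no redundancy. A tiny illustration:

```python
import math

def bits_needed(states: int) -> int:
    """Minimum whole bits to give each of `states` distinguishable states a unique label."""
    return max(1, math.ceil(math.log2(states)))

print(bits_needed(2))     # 1  (true/false)
print(bits_needed(1000))  # 10 (2^10 = 1024 >= 1000)
```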

> So this requires that space-time be quantized in some sense

Yes.

> I don't understand your question 3. Reduce the complexity of what?

So, for a random example, if a proton required 30 bits of information to describe, and 10% of the entropy of the universe were in protons, then that would potentially reduce the number of possible configurations among those 2^(10^90) variations. Basically... if the understood elements of the universe have known ways of interacting with one another, wouldn't that drastically reduce the possible permutations of bits?
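The intuition in question 3 is essentially a compression argument: regularity and known structure mean far fewer bits are needed than the raw count suggests, which is exactly what Hashlife exploits. A small sketch using zlib as a stand-in compressor (nothing to do with actual physics):

```python
import os
import zlib

N = 100_000
structured = b"proton" * (N // 6)  # highly regular data, like Hashlife's repeating patterns
random_data = os.urandom(N)        # incompressible noise

for label, data in (("structured", structured), ("random", random_data)):
    ratio = len(zlib.compress(data, level=9)) / len(data)
    print(f"{label}: compressed to {ratio:.1%} of original size")
```

The structured input collapses to a tiny fraction of its raw size, while the random input does not compress at all; quantum mechanics complicates the analogy because a quantum state generally cannot be cloned or losslessly summarized the way classical cell patterns can.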

> * Note: such a computer would have to be outside our universe, or else it'd have to contain a description of itself, which is impossible.

Apparently that's what Edward Fredkin believes, that it is 'elsewhere'. Seth Lloyd just takes the view that the universe is computing itself. I take the view that it doesn't matter in almost all cases :) Surely it's a bit bold to say it's impossible, inasmuch as it can be compared to evolution and human beings. Perhaps it's better to describe it as 'inherent' rather than 'outwith'.
 
See the Holographic Principle. On the way to developing his contribution to the Holographic Principle, Leonard Susskind et al. found the Planck dimensions of an entropic bit.
 
Thank you Doug, you have recommended it twice for me, so it certainly seems compelling. It seems, though, that he is a bit of a technophobe, and to me the things he is saying (from what I've heard so far) are not too different from the computer analogies from the likes of Seth Lloyd. I guess it's more a "way to" think than a "what to" think, but ultimately just about entropy. I've yet to read (and understand) the Holographic Principle.

I was just interested to see if there was any modelling regarding the things I mentioned above.
 
Technophobe Susskind?

Here's a link to his 136 lectures streamed on YouTube. They are senior to graduate level.


If you can do calculus and a bit of physics, they're utterly entertaining, rigorous reviews of physics from classical mechanics through QM and onto his String/M-theory and the development of the Holographic Principle. I particularly enjoy thinking of the HP as another method of dimensional reduction.
 
> Technophobe

Yes, every video I've watched he talks about how he prefers chalk and blackboard or how he's no good with computers. Not meaning to be rude or anything by saying it. :)
 
Technophobe? Admiral Rickover prohibited computers in his enginerooms because he demanded that his operators be smarter than the machinery they operated. I believe that the understanding of physical relationships is easier with chalk and wrenches. It is not technophobia; it is skepticism, and not trusting some anonymous programmer.
 
Susskind has been quoted as calling himself a technophobe. I don't think 'smart' can be applied in the context of computers today, and the Church-Turing thesis is generally accepted.

I think that's all beside the point really.
 
At lunch just now, I think I recall both Susskind and Smolin, particularly the latter, saying that they theorize first in an internal narrative and then fit the mathematical physics to the narrative. Smolin said that he likes to speak to lay-audiences for the rigorous clarification that he must bring.

And it is not beside the point. Interpretation is precisely the point of Popper's Quantum Theory and the Schism in Physics and the variety of interpretations has only increased in the subsequent half-century since he wrote it.
 
The point I meant was that a computer is just a tool, and you may have a preference for using one to achieve something that you could equally do in your head, or by hand. A preference for not using a computer doesn't really fit in with what I originally put into the thread; it just so happens that the analogy of a computer and its related terminology (a bit) can possibly help, or confuse.

The thread starts from the assumption that the amount of information in the universe is countable and computable, and I was hoping for something more in the direction of the potential layout of those 10^90 bits and what current physical laws say is possible... to what degree (if any) do we understand how those bits translate into our physical universe?
 
phinds said:
One objection to this is that since it posits a finite answer, it of necessity uses a model in which the universe is finite, but that is NOT known to be true. It MAY be true, with the universe finite but unbounded, or it may NOT be true, in which case his analysis is pointless.
I believe this analysis is based on the entropy of the cosmological horizon set by the value of the cosmological constant. It's an estimate of the number of degrees of freedom required to describe every possible universe with the cosmological constant set to the value we observe.

You can still recover an infinite universe if you allow for the possibility of a zero cosmological constant. But as long as the cosmological constant is non-zero, the entropy is finite, and therefore the number of degrees of freedom is finite.
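That finiteness can be sketched numerically: the de Sitter horizon entropy is S = A/(4 l_p^2) in nats, with horizon radius R = sqrt(3/Λ). Plugging in rough, assumed values (Λ ≈ 1.1×10^-52 m^-2, the observed order of magnitude) gives a few ×10^122 bits, the same ballpark as the figures quoted in this thread:

```python
import math

LAMBDA = 1.1e-52         # cosmological constant, 1/m^2 (approximate observed value)
PLANCK_LENGTH = 1.6e-35  # metres (approximate)

radius = math.sqrt(3 / LAMBDA)                # de Sitter horizon radius, m
area = 4 * math.pi * radius**2                # horizon area, m^2
entropy_nats = area / (4 * PLANCK_LENGTH**2)  # Bekenstein-Hawking horizon entropy
bits = entropy_nats / math.log(2)

print(f"~10^{math.log10(bits):.0f} bits")  # of order 10^122-10^123: finite, as claimed
```

As long as Λ > 0, this number is finite regardless of whether the universe itself is spatially infinite, which is the point of the post above.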
 
I think you just encapsulated my query better than I ever could.
 
