How Much Info for Building a Universe?

  • Context: Graduate
  • Thread starter: wolram
  • Tags: Building Universe

Discussion Overview

The discussion centers on how much information would be required to construct a universe, exploring the information content of the universe, the possible conservation of information, and various simulations related to quantum gravity. Participants reference theoretical frameworks, past research, and ongoing challenges in the field.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • Some participants inquire about the number of bits needed to build a universe, referencing J Baez's simulations and the concept of information content in the universe.
  • Others suggest that there may be a conservation of information law in the universe, proposing that complexity and chaos are balanced.
  • One participant mentions John Baez's collaboration on quantum gravity simulations, noting that they were conducted on a large computer rather than a desktop.
  • Another participant refers to the work of Renate Loll and Jan Ambjorn, highlighting their 2007 article on quantum gravity simulations and the emergence of space-time.
  • There is a discussion about the challenges of discretizing spacetime in causal dynamical triangulation (CDT) and the hope that finer discretization will reveal robust features of the continuum.
  • Participants express uncertainty about the current status of simulations aimed at penetrating the sub-Planckian regime and whether research has shifted focus since earlier papers.
  • One participant presents a calculation based on the Holographic Principle, estimating the information content of the observable universe and discussing the implications of the event horizon.
  • Another participant emphasizes that the nature of the information in the universe remains unknown, suggesting that the observable universe is a theoretical construct with potential limitations.

Areas of Agreement / Disagreement

Participants do not reach a consensus on the amount of information needed to build a universe or the implications of the Holographic Principle. Multiple competing views and uncertainties remain regarding the nature of information and the status of ongoing research in quantum gravity simulations.

Contextual Notes

Participants note limitations in their understanding of the current state of research, particularly regarding the challenges faced in simulations and the definitions of key concepts like information and the observable universe.

wolram
How many bits of information are needed to build a universe? I think J. Baez ran simulations on an ordinary desktop (sorry, I may be wrong), but anyway, how much information would be needed?
 
wolram said:
How many bits of information are needed to build a universe? I think J. Baez ran simulations on an ordinary desktop (sorry, I may be wrong), but anyway, how much information would be needed?

Or asked a different way, what is the information content of the universe? Some wonder if there isn't a conservation of information law in the universe so that complexity is balanced by chaos for the universe as a whole.
 
John Baez partnered with Dan Christensen (U of Western Ontario) on some QG simulations, but it was on a large computer.

Renate Loll and Jan Ambjorn did some simulations and one of their 2007 articles had a title like
"The Emergence of Space-Time: Quantum Gravity on Your Desktop"

If I remember right, the Nobel laureate George Smoot has a 2009 paper where he and coauthor(s) estimate the area of the cosmological event horizon and the entropy of the universe contained within that spherical horizon. I'm vague on that, only a slight memory of the paper. Maybe I can find a link and someone else can read/interpret.

http://arxiv.org/abs/1003.1528
see equation (8) on page 4 and further discussion on page 5 which talks about the number of bits. This might help. I don't want to try to interpret. Too speculative for me.
 
The sort of simulations done in CDT are not meant to be rigorous. They discretize spacetime. The hope is that as the discretization is made finer and finer, the qualitative features won't change, indicating some robust feature of the continuum. At present, CDT's discretization is only on the order of the Planck length, so they are now running simulations which are trying to be quite a bit finer.
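As a toy illustration of that refinement idea (not CDT itself, just an assumed analogy): discretize a circle by an inscribed regular polygon and watch the discrete perimeter converge to the continuum value as the "lattice" is refined.

```python
import math

# Toy illustration of the refinement idea (not CDT itself): approximate a
# circle of radius 1 by an inscribed regular n-gon. As the discretization
# gets finer (n grows), the discrete perimeter converges to the continuum
# value 2*pi, the kind of robustness one hopes CDT observables show as
# the lattice spacing shrinks.
for n in (8, 64, 512, 4096):
    perimeter = n * 2 * math.sin(math.pi / n)  # n edges of length 2*sin(pi/n)
    print(f"n = {n:5d}  perimeter = {perimeter:.6f}  (continuum: {2 * math.pi:.6f})")
```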
 
atyy said:
... so they are now running simulations which are trying to be quite a bit finer.

Do you have recent news to that effect?

I know at one point they were trying to force the model to zoom in further, but this encounters serious obstacles, described in their 2009 writeup of their 2008 lectures. Their recent papers have been about other things, as if that initiative is now on the back burner.

Please share if you have any current indications.
 
p62 of http://arxiv.org/abs/0906.3947

"We have also indicated how we may be able to penetrate into the sub-Planckian regime by suitably changing the bare coupling constants. By “sub-Planckian regime” we mean that the lattice spacing a is (much) smaller than the Planck length. While we have not yet analyzed this region in detail, we expect to eventually observe a breakdown of the semi-classical approximation. This will hopefully allow us to make contact with continuum attempts to define a theory of quantum gravity based on quantum field theory."
 
atyy said:

Atyy, we both saw that. I was asking if you had any recent indications.
That paper is a writeup of lectures given 3 years ago.
" lectures given at the summer school 'New Paths Towards Quantum Gravity', May 12-16 2008. "

Since they posted the June 2009 writeup, Loll and friends have posted half a dozen other papers about their current research, and I haven't seen anything about trans-Planckian sims.

That's why I asked if you had any recent information to back up what you said about current work:
"so they are now running simulations which are trying to be quite a bit finer"

There were serious obstacles in 2008 (and in 2009 when they did the writeup), so did they give up? Did they put it on the back burner and turn their attention to half a dozen other topics? Or are they still actively trying?
I thought you might know. One could always email Loll, I guess.
 
Oh, sorry, my time scale for "recent" was still much larger than the Planck time.
 
To give an idea of how big the universe is, I made this table.
Each level is 256 times bigger than the one above.
x = -6 is exactly one angstrom.
x = 10 is 36 times bigger than the observable universe.
http://www.technologyreview.com/blog/arxiv/26333

[Image: powersoftwo.png, the table of length scales described above]
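A minimal sketch reproducing the table's ladder of scales, assuming (as the post says) that each level is 256 times the previous and that x = -6 corresponds to exactly one angstrom:

```python
# Sketch of the scale ladder; the anchor point (x = -6 is 1 angstrom) and
# the universe diameter are taken from the post / rough round numbers.
ANGSTROM = 1e-10               # metres
UNIVERSE_DIAMETER = 8.8e26     # observable universe, metres (approx.)

for x in range(-6, 11):
    scale = ANGSTROM * 256 ** (x + 6)   # metres at level x
    print(f"x = {x:3d}  scale = {scale:.2e} m")

# At x = 10 the scale is ~3.4e28 m, roughly 36-40x the observable
# universe, consistent with the post's claim.
print(f"ratio at x = 10: {ANGSTROM * 256 ** 16 / UNIVERSE_DIAMETER:.0f}x")
```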
 
Nobody knows what the information is.
We observe relations between pieces of information, and we can count them (approximately).
Our observable universe (Hubble radius) is a provisional, tentative theoretical volume which, by the Holographic Principle, may contain a number of bits of information (relations of information) equal to (area of the Hubble sphere) / (4 Planck lengths squared). Bekenstein bound:
http://en.wikipedia.org/wiki/Black_hole_thermodynamics

The Hubble sphere is our observable event horizon. It grows, and is actually much larger than it appears, because we observe it as it was 13 billion years ago; in fact its expansion accelerates. This is a consequence of the Holographic Principle. See the article by Smoot and Frampton:
http://arxiv.org/abs/1002.4278

Our Observable Universe is a small part of the real Universe (nobody knows how large it is).
http://www.mso.anu.edu.au/~charley/papers/DavisLineweaver04.pdf

If the Holographic Principle is right, we can estimate:
(surface of the observable Hubble event horizon, ~10^54 m^2) / (4 Planck lengths squared, ~10^-70 m^2) ≈ 10^124 bits.

This information is tightly packed on the surface of the event horizon, while inside the volume there are filaments connecting the objects. We observe an object when light from it reaches us. The photon interacts with the vacuum (the information in the space), each interaction costs a Planck time of delay, and that is how we measure the time and distance between objects.
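A minimal sketch of the estimate above, using round-number inputs. The post's 10^124 comes from rounding the area up to 10^54 m^2 and 4 Planck lengths squared down to 10^-70 m^2; with a Hubble radius of c/H0 ~ 1.3e26 m the count comes out nearer 10^122, the same order-of-magnitude story either way:

```python
import math

# Order-of-magnitude holographic count for the Hubble sphere; both inputs
# are assumed round numbers, so only the exponent is meaningful.
HUBBLE_RADIUS = 1.3e26     # metres, roughly c / H0
PLANCK_LENGTH = 1.6e-35    # metres

area = 4 * math.pi * HUBBLE_RADIUS ** 2    # Hubble-sphere area, ~2e53 m^2
bits = area / (4 * PLANCK_LENGTH ** 2)     # holographic bound: A / (4 l_P^2)
print(f"area ~ {area:.1e} m^2, bits ~ {bits:.1e}")  # ~2e122 bits
```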
 
