The article can be found here: http://www.washington.edu/news/2012...lation-uw-researchers-say-idea-can-be-tested/

Here is the actual paper: http://arxiv.org/abs/1210.1847

My question is not about the validity of his premises (I actually don't believe they logically lead to the proposed conclusion) but about the test itself.

"Currently, supercomputers using a technique called lattice quantum chromodynamics and starting from the fundamental physical laws that govern the universe can simulate only a very small portion of the universe, on the scale of one 100-trillionth of a meter, a little larger than the nucleus of an atom, said Martin Savage, a UW physics professor."

My understanding of physics is moderate but not in-depth. Here is the general statement on QCD:

Lattice QCD is a well-established non-perturbative approach to solving the quantum chromodynamics (QCD) theory of quarks and gluons. It is a lattice gauge theory formulated on a grid or lattice of points in space and time. When the size of the lattice is taken infinitely large and its sites infinitesimally close to each other, the continuum QCD is recovered.[1]
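To get a feel for the scale quoted in the article, here is a quick back-of-the-envelope calculation. The 1e-16 m lattice spacing used below is my own assumption (roughly 0.1 fm, a typical value in modern lattice-QCD work), not a number taken from the article:

```python
# Rough estimate of lattice sites in the simulated volume from the article.
# The box side (1e-14 m, "one 100-trillionth of a meter") comes from the
# article; the lattice spacing of 1e-16 m (~0.1 fm) is an assumed typical value.

box_side = 1e-14   # meters, simulated volume's side length (from the article)
spacing = 1e-16    # meters, assumed lattice spacing (~0.1 fm)

sites_per_dim = round(box_side / spacing)  # sites along one axis
sites_per_slice = sites_per_dim ** 3       # spatial sites in one time slice

print(sites_per_dim)    # 100
print(sites_per_slice)  # 1000000
```

Even this tiny box already involves a million spatial sites per time slice, which hints at why simulating anything much larger is currently out of reach.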

Could someone perhaps explain to me in layman's terms why QCD is proposed as the simulation engine? Is this necessary or contingent?

I find this idea fascinating, and I understand the heuristic approach; I'm simply trying to fill in the gaps in my understanding. Thanks in advance.

**Physics Forums - The Fusion of Science and Community**

# Nick Bostrom's Simulation Hypothesis

