
What would it take to fully simulate a physical system?

  1. May 13, 2015 #1
    Many people out there today seem to think that we'll soon have computers powerful enough to simulate the physical world well enough that we'll be able to upload ourselves and live in such a simulation. People really seem to think a Matrix situation is possible. Some, like Nick Bostrom, have argued that we might in fact already live in a simulation. I find this all very, very doubtful, for a number of reasons.

    I am thinking of writing up a little piece that tries to convey a sense of just how much computational power it would take to simulate even a small amount of matter with a fine enough grain to make it experimentally indistinguishable from the real thing.

    If our world is a simulation, how much computation is necessary to produce it?

    It seems to me very unlikely that the world could be simulated at only a very coarse grain most of the time, with just enough detail to fool people when they aren't looking too closely, computing the fine details only when someone happens to be looking through a microscope, measuring a very short interval of time, or doing that sort of thing. I suspect that everything at the smallest scales and the shortest time intervals would have to be calculated all the time to give rise to all the behavior we see at larger scales.

    It has to be bottom-up. If not, I think that any careful scientific investigation would reveal many inconsistencies. In order for every possible microphysical observation to be perfectly consistent with every other possible observation, microscopic or macroscopic, I tend to think that it would be necessary to simulate all of the subatomic particles all the time. I think that not simulating what we aren't looking at wouldn't work. The story would fall apart.

    Biological processes, to behave as they do, rely on a huge amount of stuff going on at very tiny scales. Our computers similarly rely on very small things. And of course, we can go to Mars, or anywhere else for that matter, examine things microscopically, and make observations consistent with a long history of microscopic processes. And it is all a very tight story. We'll never discover any inconsistencies.

    If the story given us were constructed from the top down, with a degree of simulation detail only fine enough to match our observational detail at any given time, as we drill down in our investigations from different angles, we would surely find that the story doesn't make sense, that there are inconsistencies of some kind. It would be hard to explain the success of science and the degree of consilience that we find if the world were not processed bottom-up. Everything at macroscopic scales seems to be fully accounted for by the microphysical details.

    We can find a wealth of microscopic evidence for a long history of physical processes that preceded our existence. You can't just start the simulation with the first moment a human made an observation! To have a fully consistent history that would make sense to careful scientific investigators, you'd have to calculate the whole history.

    And contrary to what some think, it wouldn't suffice to just simulate our brains at a coarse grain like a simple neural net! That's crazy! You have to also simulate the environment if you want to give us the sort of experiences that we have, with all the information available to us through any possible investigation of that environment. Basically, it would require a complete simulation of at least the environment available to us, including this planet's surface and possibly the rest of the solar system.

    I think that if I can get a realistic idea of the number of calculations it would take to fully simulate something as small as a grain of salt, and show how large a computer it would take to do that, it would show all this simulated reality stuff to be the nonsense that it is.
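    For a sense of scale, here's a back-of-envelope count of the particles involved; every number below is my own rough assumption (a 0.3 mm cube of table salt):

```python
# Back-of-envelope atom count for a grain of salt.
# Assumptions: a 0.3 mm cube of NaCl, density 2.165 g/cm^3.
side_cm = 0.03                  # grain edge, cm (0.3 mm, assumed)
volume_cm3 = side_cm ** 3
mass_g = volume_cm3 * 2.165     # NaCl density, g/cm^3
moles = mass_g / 58.44          # NaCl molar mass, g/mol
atoms = moles * 6.022e23 * 2    # two atoms (Na + Cl) per formula unit
print(f"{atoms:.1e} atoms")     # ~1.2e18 atoms
```

    So even this conservative accounting puts the particle count around 10^18, before electrons, photons, or field degrees of freedom enter the picture.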

    Okay, so what would it take? Does the simulation need to go all the way down to the Planck length, with a time step of Planck time? Or can we just treat each subatomic particle as a simple entity with a handful of numbers associated with it? Or do you think it would be adequate to ignore particles like quarks and just simulate protons, neutrons, electrons, and so on?

    How many calculations per time step would be required for each particle? It seems to me that it wouldn't make sense to calculate the forces acting on each particle by adding up all the forces from every other particle. Rather, it might be more efficient to just keep track of the fields, calculating each particle's local effect on the fields and the effects of the fields on the particles. Does this sound right?
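    That intuition matches how large N-body codes actually work: a direct pairwise sum costs O(N^2) per step, while particle-mesh methods that deposit charge on a grid and solve for the field cost roughly O(N + M log M) for M grid points. A rough sketch of the difference, with all constants assumed (~10 flops per pair interaction, one grid cell per particle, N of the order of the atoms in a grain of salt):

```python
import math

# Per-step operation counts, direct sum vs. particle-mesh.
# All constants are assumptions: ~10 flops per pair interaction,
# an FFT-based field solve, one grid cell per particle.
N = 1.2e18                                # particles (about a grain of salt)
M = N                                     # grid cells, one per particle
pairwise_ops = 10 * N * (N - 1) / 2       # every pair, every step
mesh_ops = 10 * N + 5 * M * math.log2(M)  # deposit/interpolate + FFT solve
print(f"direct sum:    {pairwise_ops:.1e} ops/step")
print(f"particle-mesh: {mesh_ops:.1e} ops/step")
```

    The field-based approach wins by roughly sixteen orders of magnitude here, so yes, tracking fields rather than summing pair forces seems like the only sane choice.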

    I was thinking that it might make sense to use the holographic principle and just keep track of each bit that is written on each Planck area of the surface of the sphere bounding the region being simulated, maybe with a time step of Planck time. This would simplify the estimation of required computing power greatly, but I fear it might be too fine-grained and might greatly exaggerate the computational requirements.
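    For what it's worth, here is what that bound comes out to for a sphere just big enough to enclose a grain of salt, using the Bekenstein-Hawking formula S = A / (4 l_p^2) (in nats); the radius is my assumption:

```python
import math

# Holographic (Bekenstein-Hawking) bound on the information content of
# the bounded region: S = A / (4 l_p^2) nats, i.e. A / (4 l_p^2 ln 2) bits.
l_p = 1.616e-35        # Planck length, m
r = 1.5e-4             # bounding-sphere radius, m (assumed: ~a salt grain)
area = 4 * math.pi * r ** 2
bits = area / (4 * l_p ** 2 * math.log(2))
print(f"{bits:.1e} bits")   # ~4e62 bits
```

    At ~10^62 bits for a millimetre-scale region, versus the ~10^18 particles actually present, this bound does look wildly loose, which supports the worry that it would greatly exaggerate the requirements.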

    According to Wikipedia,
    As of May 2010 [update], the smallest time interval uncertainty in direct measurements is on the order of 12 attoseconds (1.2 × 10^−17 seconds), about 3.7 × 10^26 Planck times.

    Is there any reason to think that we might still need a time interval shorter than this?
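    Redoing that conversion myself (Planck time t_p ≈ 5.39 × 10^-44 s) gives a somewhat smaller multiple than the quoted one, though the same order of magnitude:

```python
# 12 attoseconds expressed in Planck times (t_p ~ 5.391e-44 s).
t_interval = 1.2e-17      # best direct time-interval measurement, s
t_planck = 5.391e-44      # Planck time, s
ratio = t_interval / t_planck
print(f"{ratio:.1e} Planck times")   # ~2.2e26
```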

    I am not sure how to even begin thinking about how to simulate all the quantum mechanical behavior.

    Any ideas would be appreciated.
  3. May 13, 2015 #2


    Staff Emeritus
    Science Advisor
    Education Advisor

    I'm sorry, but who exactly are these people you talked about?

    This is still a fallacy. No one can yet derive the phenomenon of superconductivity by considering individual particle interactions one at a time! This is a many-body phenomenon that still can't be derived by ANY computer. Read Robert Laughlin's Nobel Prize lecture.

    So if that hasn't been shown to be possible, what do you think the chances are of simulating "the physical world"? The people you listen to have obviously never heard of emergent phenomena.

  4. May 13, 2015 #3
    Have a look at this:


    This idea, surprisingly, has quite a lot of traction in the academic world, and is now pretty widespread among the lay public. I think these people probably haven't consulted physicists and computer scientists on the matter.

    About superconductivity, superfluidity, the many-body problem, and the like, I don't understand enough about it to see why a simulation wouldn't be able to produce these effects. Bear in mind that I am not a physicist! I don't have a very deep understanding here. But from what little I understand, I have the impression that something like a many-body problem is a problem of getting an exact analytic solution. But a simulation definitely allows you to approximate the behavior of many gravitational bodies, for example. This is done routinely.

    Superconductivity and superfluidity are quantum mechanical and quite beyond my understanding. Is there some reason that these can't emerge in a quantum mechanical simulation?

    I understand that you can't deduce this behavior from the behavior of individual particles. That seems like a different problem from simulation.

    The behavior of water is often given as an example of emergence. It seemingly isn't something you'd deduce from the properties of individual water molecules. But if you simulate a bunch of water molecules interacting step by step, you get the behavior of water. You might not be able to deduce exact fluid dynamical behavior from mathematical models of individual water molecules, but that doesn't mean a simulation won't yield fluid dynamics. The behavior of water is still fully determined by and emerges from the behavior of water molecules. If there were really some extra causal factor at the level of description of fluidity, then the system would involve some overdetermination. Emergence, at least in its weak form, is not incompatible with reduction.

    We can see another example in cellular automata, in Conway's Game of Life. The behavior of gliders and whatnot is emergent. And I doubt that one can easily deduce this behavior from the basic rules of this simple world. Going further, you can build a fully functional computer inside this world, and can even run software on it, like, for example, a simplified simulation of a billiards game. In this case, the behavior of billiard balls isn't something you'd deduce from the rules of Conway's Game of Life. And yet, if you have the right arrangement of the elements of that world and you compute it step by step, you get this behavior. It is fully determined by the low-level, smallest elements of the world, but isn't deducible from the basic rules or first principles.
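    To make this concrete, here is a minimal sketch of a Life step (live cells stored as a set on an unbounded grid). Nothing in the code mentions gliders, yet running it for four steps moves a glider one cell diagonally:

```python
# A minimal Game of Life step on an unbounded grid (live cells as a set),
# just to show that glider behaviour falls out of the local rules alone.
from itertools import product

def step(live):
    # Count live neighbours of every cell adjacent to a live cell.
    counts = {}
    for (x, y) in live:
        for dx, dy in product((-1, 0, 1), repeat=2):
            if (dx, dy) != (0, 0):
                counts[(x + dx, y + dy)] = counts.get((x + dx, y + dy), 0) + 1
    # Birth on exactly 3 neighbours; survival on 2 or 3.
    return {c for c, n in counts.items()
            if n == 3 or (n == 2 and c in live)}

glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
state = glider
for _ in range(4):
    state = step(state)
# After four steps the glider reappears translated by (1, 1).
print(state == {(x + 1, y + 1) for (x, y) in glider})  # True
```

    The glider-level behavior is nowhere in the rules; it only appears when you actually run the low-level update, step by step, which is exactly the point about bottom-up simulation.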

    The world is full of stuff like this. Canyon erosion is fully determined by the behavior of the basic elements of the world and the basic laws of physics, but it certainly isn't deducible from first principles. You can't solve some equation and get the state of a canyon at a given time as a solution. But you can definitely simulate canyon erosion using simple elements and simple laws: if you start from a certain state of a bunch of particles and then simulate, step by step, how the system evolves, how all the particles move, without skipping any steps, you'll get erosion.

    In computer graphics for film, people want to generate realistic landscapes, and certain simple mathematical procedures can quickly produce fake fractal mountain ranges and the like with arbitrary detail. But to get realistic erosion, you need to simulate soil transport, water flow, and other processes over a period of time. There is no simple procedural way to fake it, and certainly no solution to a set of equations that will yield it. There is just no way around doing a very computationally expensive simulation of the erosion process if you want realistic erosion. Obviously, the same would be true for biological evolution or any other large-scale, high-level process.

    Is the case for superconductivity and superfluidity different from the kinds of emergence I've been describing? The lecture you mentioned is far beyond my understanding. Is it literally truly irreducible to fundamental physics? If so, this is very puzzling to me. It would seem to turn these phenomena into some sort of magic. It would seem to say that there is no way, even in principle, to account for or explain superconductivity or superfluidity in terms of something more fundamental.

    I don't know how relevant this is:


    But really, for my purposes here, I don't care too much about this emergence problem. I mostly just want to get a decent idea of how much computational power would be required to simulate something like a grain of salt, perhaps ignoring exotic states of matter like superconductivity and superfluidity.

    The amount of stuff going on in even something that small is vast, and I want to convey that by showing how many calculations per second it would take, and then how many supercomputers it would take to perform those calculations in real time. Most people just don't appreciate how much more complex the real world is than the video games they play. Their games give them the impression that we'll soon have ones completely indistinguishable from reality. I want to show that it truly would take a vast number of Tianhe-2 supercomputers, or whatever the number turns out to be, to do the job for even a tiny bit of matter like a grain of salt, just to put the problem of simulating the world into perspective.
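    As a first stab at that punchline, here is the kind of arithmetic I have in mind, with every input a loud assumption (~10^18 particles, ~100 operations per particle per step, one-attosecond time steps, and Tianhe-2's ~33.9 petaflop/s Linpack figure):

```python
# Punchline arithmetic: machines needed to run the grain in real time.
# All inputs are rough assumptions, stated in the text above.
particles = 1.2e18
ops_per_particle_step = 100.0
steps_per_second = 1e18                # one attosecond per step
ops_per_second = particles * ops_per_particle_step * steps_per_second
tianhe2_flops = 33.86e15               # Tianhe-2 Linpack Rmax, flop/s
machines = ops_per_second / tianhe2_flops
print(f"{machines:.1e} Tianhe-2s")     # ~3.5e21 machines
```

    Even with these generous per-particle numbers, the answer comes out on the order of 10^21 copies of the fastest computer on Earth, for one grain of salt.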
    Last edited: May 13, 2015
  5. May 13, 2015 #4


    Staff: Mentor

    Thread closed for Moderation...
  6. May 13, 2015 #5


    Staff Emeritus
    Science Advisor

    We really have no idea. For one thing, we can't give the brain anything more than extremely rudimentary stimuli, let alone something close to reality. We also don't know how to fool the brain into believing these stimuli are the real thing, so speculating on how much processing power it would take is putting the cart before the horse.

    Since there's no real way to answer your questions, other than, "we don't know", this thread will remain closed.