I've been reading a book called 'The Big Questions: Physics' by Michael Brooks during my trip to Warsaw. It's an enjoyable collection of essays, even if they were perhaps slightly too basic at times for a physics student like me. What grabbed my attention, though, was the simulated reality hypothesis (http://en.wikipedia.org/wiki/Simulated_reality).

Up until now I treated the idea of us all being stuck in a Matrix spin-off as an entertaining thought, but unscientific, untestable and solipsistic. What sparked my interest, however, is the argument from "conservative computing":

- You never want to waste computing power when writing new software.
- This means that any simulation will not be infinitely smooth.
- When we get down to the quantum level, things get - as we all know - very weird.

Can it be that the quantum weirdness is simply us getting closer to the basic machinery of the running simulation? Is the cat being both dead and alive until someone measures it a way of saving computing power - if nobody and nothing interacts with a particle, why waste the rendering power?

I won't go very deep into it right away. Surely somebody in here must have heard of this before. How valid is this hypothesis? Is it scientifically sound, or just philosophical and untestable hogwash?

Cheers.
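PS, for the programmers here: the "don't render until observed" idea is essentially lazy evaluation. Here's a toy sketch of the analogy (purely illustrative, not actual physics - the class name and the coin-flip "state" are just made up for the example):

```python
import random

class LazyParticle:
    """Toy analogy: the particle's state is not computed
    until something actually 'measures' it (lazy evaluation)."""

    def __init__(self):
        self._state = None  # nothing 'rendered' yet, no resources spent

    @property
    def rendered(self):
        # Has the simulation actually computed this state?
        return self._state is not None

    def measure(self):
        # Only now is the 'computing power' spent; afterwards
        # the state is fixed and repeated measurements agree.
        if self._state is None:
            self._state = random.choice(["dead", "alive"])
        return self._state

cat = LazyParticle()
print(cat.rendered)   # False - state never computed
print(cat.measure())  # state gets fixed only upon first measurement
print(cat.rendered)   # True - and it stays fixed from now on
```

Of course a real interpretation of quantum mechanics is nothing this simple, but it captures the "conservative computing" intuition: cost is only paid at the point of interaction.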