Tanelorn
Chalnoth, sorry, I think I am just repeating myself, so perhaps I am not making any sense. If I may, just one last time:
Consider the video game analogy: all that is required to compute iteration n is iteration n-1. It is not necessary to store iteration n-2 or any older iteration data, so in an efficient system any data older than iteration n-1 would be discarded as unnecessary and would therefore no longer exist.
General relativity, special relativity, and frame-of-reference effects notwithstanding, I am suggesting that only the particle interaction data in time slice n-1 are required to compute or create time slice n, nothing more than this.
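Just to make the idea concrete, here is a toy sketch (the step rule below is completely made up for illustration, not meant to be real physics): the loop only ever holds the previous slice in memory, and everything older is gone.

```python
def step(state):
    # Made-up, deterministic update rule: slice n is a function of slice n-1 alone.
    # Each toy "particle" is a (position, velocity) pair that just drifts.
    return [(x + v, v) for (x, v) in state]

# Slice 0: a few toy particles.
state = [(0.0, 1.0), (5.0, -0.5), (2.0, 0.25)]

for n in range(1, 11):
    state = step(state)  # slice n overwrites slice n-1; the old slice no longer exists

print(state)  # only the current slice survives in memory
```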
From this perspective, we would literally have to reverse the direction of time (equivalent to the computer clock, which unfortunately can only be positive!) for the entire Universe in order to take the present time slice backwards in time, and we have no control over the direction of the arrow of time, at least from inside our Universe. The past is therefore gone, and the only way of recreating a past iteration would be to start the whole thing over again from time slice (iteration) 0.
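In the same toy terms (again, the update rule is invented purely for illustration), since the intermediate slices were thrown away, the only way to get slice k back is to replay everything from slice 0:

```python
def step(state):
    # Same sort of made-up, deterministic update rule as before.
    return [(x + v, v) for (x, v) in state]

def recreate_slice(slice_0, k):
    """Recover time slice k by replaying the whole history from slice 0.

    Because every intermediate slice was discarded, there is no shortcut:
    the cost is k applications of the update rule, however far back k is.
    """
    state = slice_0
    for _ in range(k):
        state = step(state)
    return state

slice_0 = [(0.0, 1.0), (5.0, -0.5), (2.0, 0.25)]
print(recreate_slice(slice_0, 7))  # the only way to "see" slice 7 again
```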
The Universe in this model resembles a massively parallel computer, and perhaps the finite speed of light and other properties of the Universe are due to the equivalent of computer hardware limitations. Or perhaps the finite speed of light is simply necessary to prevent everything in the whole Universe from interacting simultaneously with everything else; cause and effect could potentially become completely unstable if the speed of light were not finite.
I saw an episode of Wormhole recently in which they proposed another "computer" metaphor like this, and they mentioned how particle interactions (or was it diffraction patterns?) become more precise depending on how closely they are being observed; i.e. the "computer" appears to calculate particle data more precisely depending on whether or not a more exact measurement is being required.