kakaz
Let me mention an idea here, although it is still rather vague.
The general notion of an algorithm is a model of computation with a finite number of steps, which may be performed on a device from some "set of computational devices". Up to now, this set consists of quantum and classical computers, both of which may be simulated on a universal Turing machine. This means that the Turing machine remains the most general model of computation (in harmony with other notions of effectively computable functions, such as the lambda calculus or recursive functions).
So let me explain the motivation for the questions asked at the end:
A Turing machine is in some way connected with the notion of time: it is an idealisation of computation as performing one operation at a time. As we discover new interesting phenomena (a famous example: http://arxiv.org/abs/1101.2565, also discussed at http://www.technologyreview.com/blog/arxiv/26270/), it is probably time to ask what role time plays in the Turing model of computation, and whether that role has a sufficiently rigorous definition.
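To make the "one operation at a time" idealisation concrete, here is a minimal sketch of a Turing-machine step loop. The particular machine (a unary incrementer) and all names in it are my own illustrative assumptions, not something from the references; the point is only that "time" enters the model as nothing more than the loop counter over discrete configurations.

```python
def run_tm(transitions, tape, state="start", halt_state="halt", max_steps=1000):
    """Run a Turing machine; return (tape, steps).

    Each loop iteration is one discrete 'tick' of time: the step index is
    the only place time appears in the model.
    """
    tape = dict(enumerate(tape))  # sparse tape; missing cells read as blank "_"
    head = 0
    for step in range(max_steps):
        if state == halt_state:
            return tape, step
        symbol = tape.get(head, "_")
        state, write, move = transitions[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    raise RuntimeError("step budget exceeded")

# Illustrative machine: increment a unary number by scanning right over the
# 1s and writing an extra 1 on the first blank cell, then halting.
rules = {
    ("start", "1"): ("start", "1", "R"),
    ("start", "_"): ("halt", "1", "R"),
}
tape, steps = run_tm(rules, "111")
print("".join(tape[i] for i in sorted(tape)), steps)  # → 1111 4
```

Nothing in the simulator refers to physical time; replacing the loop counter by any other strictly ordered index set would leave the computation unchanged, which is what motivates the questions below.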
Let us imagine a simple Gedankenexperiment. Consider the class of Turing machines for which the initial data and the result have to occupy at least 4 (in general n, a natural number) cells one after another. This looks like an artificial idea, but when we refer instead to separation in time, it sounds very natural. We use "separation in time" as a very natural notion; in fact we require it: the answer has to be obtained after the question, right? So, in light of phenomena like time entanglement, mentioned in the references above, it is probably time for a review. It is not clear to me whether this phenomenon points to something new, but it may.
Of course, the "notion of time" may not be important here at all: perhaps a notion of order is sufficient (although I do not know how such a change would affect quantum computation, where time is, I suppose, an important parameter; but in a mathematical idealisation/model it is probably not important).
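The "order instead of time" idea can be stated precisely using the standard formalisation of a Turing-machine run as a sequence of configurations (this is a textbook observation, not a claim from the references above):

```latex
% A run is a sequence of configurations c_i = (\text{state}, \text{tape}, \text{head}),
% linked by the transition function \delta lifted to configurations:
c_{i+1} = \delta(c_i), \qquad i = 0, 1, 2, \dots
% The "time" index i carries no information beyond its position in the
% sequence: any order-isomorphic relabelling of the indices describes the
% same computation, so within this idealisation only the successor order
% on configurations matters, not any physical time parameter.
```

This suggests that, at least classically, the questions below are about whether anything is lost when the time index is replaced by an abstract (total or partial) order.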
(1) Is this idea new, or is it old?
(2) Is it interesting?
(3) If we then change the notion of time order in a UTM to a notion of abstract order, will that change the definition of an algorithm/Turing machine?
(4) Does time entanglement have any influence on the computational models discussed here?