
Probabilistic versus non-deterministic Turing machines

  1. Jun 12, 2013 #1



    Although this question is more theoretical than most of the threads in this rubric, it still seems to me to fit the description "Computer Science". If I am wrong, perhaps someone would tell me where the question belongs.

    The question is as follows: After reading the descriptions of non-deterministic and of probabilistic Turing machines, it appears that one way to characterise the difference would be to say that the results of a probabilistic Turing machine follow the laws of probability, whereas the output of a non-deterministic computer follows no pattern. Is this correct?

    Thank you.
  3. Jun 13, 2013 #2
    It seems that your description of a probabilistic Turing machine is reasonable.

    I think this is a bit different from what you are describing.

    I believe it is more constrained than "follows no pattern" implies.

    The description I remember from far, far too long ago (and thus might be scrambled) was that a non-deterministic Turing machine could be thought of as having a whole collection of states, tapes and positions instead of a single unique state, tape and position at each moment in time. This meant that at some point, for some inputs, the machine would replicate itself into two or more machines, one for each possible next state, tape and position, whenever there was more than one possible choice to be made.

    A non-deterministic machine starts out with a single state, tape and position. It might remain like that as it moves from state to state doing the processing. But at some point there might be two or more different "next steps" to take. So you imagine that you suddenly have two or more machines, each one taking one of the paths.

    I vaguely think that the first one which got to an accepting state then terminated the process, but I'm less certain about that.

    But there is a pattern. It is as if I give you a stack of papers to grade and a book of instructions on how to do this. You start working and at some point realize there are two contradictory instructions in the book. So you clone yourself, together with all the work on your desk and the book; one of you follows one instruction, the other follows the other, and you both keep working. There is a clear pattern to that.
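    To make the "cloning" picture above concrete, here is a minimal Python sketch. Everything in it (the configuration triples, the toy transition function, the state names) is invented for illustration, and it ignores real tape mechanics; the point is only that the transition function may return several successors, and a worklist explores every clone.

```python
from collections import deque

def run_nondeterministic(start, delta, accepting):
    """Breadth-first search over all clones; accept if ANY branch accepts."""
    queue = deque([start])
    seen = set()
    while queue:
        config = queue.popleft()
        if config in seen:
            continue
        seen.add(config)
        if config[0] in accepting:
            return True          # a clone reached an accepting state
        queue.extend(delta(config))  # "replicate" into all possible next steps
    return False                 # every branch halted without accepting

# Tiny invented example: from state 'q0' the machine may guess 'qa' or 'qb';
# only the 'qb' guess leads on to the accepting state 'qf'.
def delta(config):
    state, tape, head = config
    if state == 'q0':
        return [('qa', tape, head), ('qb', tape, head)]
    if state == 'qb':
        return [('qf', tape, head)]
    return []

print(run_nondeterministic(('q0', '', 0), delta, {'qf'}))  # True
```

    Note that the search itself is perfectly deterministic: "non-deterministic" describes the branching transition relation, not unpredictable output.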
  4. Jun 13, 2013 #3



    Thank you for the enlightening explanation, Bill Simpson.
    Indeed, it was precisely upon the Wiki links you sent that I based my question.
    However, the following question then nags at me:

    Following your explanation of "clones": if one united all of the clones into a single machine (e.g., put them all into a black box that, at each generation, emitted the collection of outputs from all the individual clones), then the outputs would follow a pattern. As you said, the branching itself has a clear pattern once each splitting point is known. Therefore the end result would be a set of results which would be determined. You said that the difference also lies in the probabilistic machine eventually selecting a single result; but if the non-deterministic one did not, then its result, being a set, would be determined, which would give the lie to the name.

    Hopefully this query will reveal what I still am not getting here.
  5. Jun 14, 2013 #4
    If I remember correctly, all forms of Turing machines are equivalent in power, as long as you are willing to ignore how long it takes and how much tape and table storage might be required during the process. Multi-tape machines are exactly as powerful as single-tape machines, as random-access machines, and as non-deterministic machines. In other words, given any one of them you can write another one of them that will do the same thing.

    I believe it is also the case that the intermediate state of the tape is ignored; it is only when the machine finally halts, or never halts, that you consider the result. So your concern about what is happening during the process may be getting in the way of your understanding all this.

    It has been too many years, but I vaguely think I might have learned that for non-deterministic machines it was the one that halted in an accepting state which was "the answer." You really should check that to make sure I'm not misleading you. But I think it ended up with "the answer" and not a whole set of answers.

    I don't recall having looked at probabilistic Turing machines. But again I assume they are strictly equal in power and also come up with "the answer" when they reach a final state.

    Sometimes seeing how to simulate one form of Turing machine using another is pretty obvious. Sometimes it is very difficult.

    I have this feeling that I still haven't understood exactly what it is that you are stuck on. Keep trying and maybe we will find what it is.
  6. Jun 14, 2013 #5



    Thanks for the reply, Bill Simpson. You are correct, the theory says that the only essential difference between the three types of TMs is their efficiency. But there is the sticking point: if they are all equivalent, then they should all give the same answer to the same questions. That is, when one type of TM could give an answer, so could the others. But this is not the case: a probabilistic TM, such as a quantum computer, can give an answer that, according to the theorems (Bell, Kochen-Specker, Pusey-Barrett-Rudolph) which almost rule out hidden-variable theories, cannot be obtained by any deterministic process such as exemplified by a TM. Or, more simply, the same question posed twice to a deterministic TM will give the same answer each time, whereas the same question posed twice to a probabilistic TM can give two different answers. Hence it does not seem that the TMs would be equivalent. Somewhere this reasoning is undoubtedly false, but I do not see where.
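    A toy illustration of that last contrast (the functions here are hypothetical stand-ins, not Turing machines): a deterministic procedure must repeat its answer on the same input, while a probabilistic one is allowed to differ between runs. Seeded generators stand in for "separate runs".

```python
import random

def deterministic_answer(x):
    return x % 2 == 0            # same input -> always the same answer

def probabilistic_answer(x, rng):
    # flip a coin before answering; two runs may disagree on the same input
    return rng.random() < 0.5

print(deterministic_answer(10), deterministic_answer(10))   # True True

# Twenty independent "runs" on the same question:
answers = {probabilistic_answer(10, random.Random(s)) for s in range(20)}
print(answers)   # both True and False show up across runs
```

    The equivalence claim in the thread is about which problems can be decided at all, not about run-by-run outputs being identical, which is one way to dissolve the apparent contradiction.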
  7. Jun 16, 2013 #6
    Good. But "efficiency" is a slippery term. It is easy to create a TM that requires huge tables and numbers of transitions, while someone very bright can sometimes produce something with far smaller tables and numbers of transitions.

    This idea was constructed to prove things about what was possible, not to fret about an optimizing compiler being or not being able to reduce the number of steps needed.

    A probabilistic machine may make a different sequence of transitions, but it isn't clear that this means you cannot create a machine that gives the correct output every time when it enters the halt state. I think you can easily construct some artificial example of a probabilistic TM which does some useless probabilistic steps before making any real progress, and then starts making transitions, each with probability 1, that do the actual work; the end result is identical to an "ordinary" TM, just after initially wasting steps. Nothing in the description says anything about what the probabilities have to be.
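    That artificial construction can be sketched in a few lines of Python (a hypothetical toy, not a real TM; the function name and structure are invented): random choices affect only the trajectory, never the final answer.

```python
import random

def wasteful_probabilistic_sum(values, rng):
    # useless probabilistic phase: loiter for a random number of steps
    for _ in range(rng.randrange(1, 10)):
        pass
    # deterministic phase: every "transition" happens with probability 1
    total = 0
    for v in values:
        total += v
    return total

# Different seeds change how long the machine dawdles, never the answer.
print(wasteful_probabilistic_sum([1, 2, 3], random.Random(0)))   # 6
print(wasteful_probabilistic_sum([1, 2, 3], random.Random(42)))  # 6
```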

    I suppose this can be settled if you can select a problem and produce a proof that one of the variants of TM can produce a solution and one other variant of TM cannot produce a solution. But I suspect this is impossible to do and the real difficulty lies in a misunderstanding of some detail.
