Probabilistic complexity classes and acceptable sources of entropy

In summary, there are algorithms that can simulate the behavior of a probabilistic Turing machine, but their complexity and efficiency vary greatly, and a perfect simulation is not always available. The complexity of such simulations is an active area of research, and the answer depends on the specific properties and design of the algorithm in question.
  • #1
Coin
(This may be a little bit outside the normal scope of this particular forum. I realized after writing this I didn't really know where to go with questions about computational complexity. Does anyone have any recommendations as to places where this question might be more appropriate? Anyway:)

So I've got this question. The question started out as "is randomness computable?" but I realized I didn't have a very precise idea of what I meant by "randomness" or "computable". So, never mind that question. Here's the question I think I *really* want to ask:

Let's say that I've got a standard probabilistic Turing machine. It's a Turing machine, and it has an entropy source it can consult to get random bits. It's running a program I'll call D. D is an algorithm in BPP. So when I run D on my probabilistic Turing machine, it runs in polynomial time and, for any given input, is wrong at most 1/3 of the time.

Now let's consider a slightly different situation. I've got two *deterministic* Turing machines. Let's call them T1 and T2. These Turing machines are totally normal, except the first one (T1) is rigged up to *think* it's a probabilistic Turing machine. It has an "entropy source" it can consult for bits. But the "entropy source" is actually just T2. T2 is loaded up with a program (let's call it R) that when run produces a stream of bits according to some algorithm; R's input tells it how many bits to produce before shutting off. (I'm assuming this detail makes no difference, but let's say R works such that the output of R(2) is a prefix of the output of R(4).) When T1 consults its "entropy source" for the first time, we run R(1) or something on T2 and feed T1 the first bit on T2's output tape. When T1 consults the entropy source a second time we run R(2) and feed T1 the second bit on T2's output tape... and so on.
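
To make the setup concrete, here is a rough Python sketch of what I have in mind (the particular choice of R, a counter fed through SHA-256, and the toy algorithm D are just placeholders I made up, not part of the question):

```python
import hashlib

def R(n):
    """Hypothetical deterministic bit-stream program for T2: returns the first
    n bits of a fixed stream, so R(2) is a prefix of R(4) by construction."""
    bits = []
    counter = 0
    while len(bits) < n:
        # Derive 256 deterministic bits at a time from a running counter.
        block = hashlib.sha256(counter.to_bytes(8, "big")).digest()
        bits.extend((byte >> i) & 1 for byte in block for i in range(8))
        counter += 1
    return bits[:n]

class FakeEntropySource:
    """T1's view of its 'entropy source': the k-th query re-runs R(k) on T2 and
    hands back the k-th (newest) bit, and k is never reset between runs of D."""
    def __init__(self):
        self.k = 0

    def next_bit(self):
        self.k += 1
        return R(self.k)[-1]

def D(x, entropy):
    """Toy stand-in for a BPP algorithm: decides 'is x odd?', but gives the wrong
    answer whenever both consulted bits are 1 (error 1/4 under truly uniform bits)."""
    b1, b2 = entropy.next_bit(), entropy.next_bit()
    correct = (x % 2 == 1)
    return (not correct) if (b1 == 1 and b2 == 1) else correct

source = FakeEntropySource()
print([D(7, source) for _ in range(5)])  # repeated runs keep reading the same fixed stream
```

The point is only that T2's stream is fixed once and for all, and that repeated runs of D keep reading further into that same stream.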

So, here's my question, in three parts:

1. Is there *any* algorithm R such that when we run D on T1, D will perform the way it did on the probabilistic Turing machine-- i.e., D runs in polynomial time, and for any given input is wrong at most 1/3 of the time? (Let's say that when we time D, we only count steps taken on T1-- we don't count any time spent running or rerunning R on T2 to get more entropy. Let's also say, just to make this work, that we calculate that "1/3 of the time" by rerunning D on T1 multiple times without "resetting" T2-- such that if the first run of D consumes k bits of entropy from T2, then the first bit requested by the second run of D will be the (k+1)th bit of R(k+1).)

2. If R(n) exists, is there *any* known bound on its runtime complexity with respect to n?

3. Do the answers to any of the above questions change if rather than D being in BPP, it had been in some other probabilistic complexity class such as PP or BPL or something?
 
  • #2


Hello, thank you for your question. Computational complexity is a fascinating and complex topic, and it is understandable that you may have some difficulty finding the appropriate forum to discuss it in. I would recommend checking out forums or discussion boards dedicated to theoretical computer science, as this is a topic that falls within that field of study.

To address your question, let's break it down into its three parts.

1. Is there any algorithm R such that when we run D on T1, D will perform the way it did on the probabilistic Turing machine?

The short answer is a qualified yes. The natural candidates for R are pseudorandom number generators (PRNGs), which are widely used in simulations and cryptography. Whether a particular PRNG's output actually preserves D's error guarantee depends on how well the generator "fools" D: no PRNG perfectly mimics a true random source, but for many algorithms a good generator is indistinguishable from one in practice. Under standard hardness assumptions from complexity theory, generators strong enough to fool every polynomial-time algorithm exist (which would imply BPP = P), but no such generator is known unconditionally.
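
As a rough illustration of what "simulating" randomness can mean in the most literal, brute-force sense, the sketch below replaces the random bits of a toy bounded-error procedure (the hypothetical `noisy_is_odd`, not your D) by enumerating every possible coin sequence and taking a majority vote. This always recovers the right answer for a BPP-style algorithm, but at a cost exponential in the number of random bits:

```python
from itertools import product

def majority_over_coins(algorithm, x, num_coins):
    """Brute-force derandomization: run `algorithm` on x with every possible
    string of num_coins coin flips and return the majority answer.  For a
    bounded-error (BPP-style) algorithm the majority is always correct, but
    the simulation costs 2**num_coins runs."""
    votes = 0
    for coins in product((0, 1), repeat=num_coins):
        votes += 1 if algorithm(x, list(coins)) else -1
    return votes > 0

def noisy_is_odd(x, coins):
    """Hypothetical bounded-error predicate: 'is x odd?', lying exactly when its
    two coin flips are both 1 (error probability 1/4 per run under uniform coins)."""
    correct = (x % 2 == 1)
    return (not correct) if coins[0] == 1 and coins[1] == 1 else correct

print(majority_over_coins(noisy_is_odd, 7, num_coins=2))   # True
print(majority_over_coins(noisy_is_odd, 10, num_coins=2))  # False
```

A good PRNG tries to achieve the same effect from a single short seed instead of full enumeration, which is exactly where the difficulty lies.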

2. If R(n) exists, is there any known bound on its runtime complexity with respect to n?

The complexity of PRNGs is an active area of research, and there are many different constructions with varying levels of efficiency and output quality. Practical PRNGs typically produce n bits in time polynomial (often roughly linear) in n, but the constants and the strength of the guarantees vary widely from one construction to another. Without knowing the specific properties and design of R, it is difficult to state a tighter bound on its runtime.
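
For a sense of scale, here is a hedged sketch showing that a simple counter-mode hash construction (like the placeholder R sketched earlier in the thread) produces n bits in time roughly linear in n. This says nothing about the quality of the bits, only about the cost of generating them:

```python
import hashlib
import time

def stream_bits(n):
    """Counter-mode hash construction: each SHA-256 call yields 256 bits,
    so producing n bits takes O(n) work."""
    bits = []
    counter = 0
    while len(bits) < n:
        block = hashlib.sha256(counter.to_bytes(8, "big")).digest()
        bits.extend((byte >> i) & 1 for byte in block for i in range(8))
        counter += 1
    return bits[:n]

for n in (10_000, 100_000, 1_000_000):
    start = time.perf_counter()
    stream_bits(n)
    elapsed = time.perf_counter() - start
    print(f"{n:>9} bits in {elapsed:.4f} s")  # time grows roughly linearly with n
```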

3. Do the answers to any of the above questions change if D is in a different probabilistic complexity class?

The answer depends on the specific class. Classes such as BPP, PP, and BPL differ in their error requirements and resource bounds: BPL restricts space rather than time, and PP allows the error to be arbitrarily close to 1/2 rather than bounded away from it. So the difficulty of replacing the random source with a deterministic one can change from class to class, and a generator that suffices for one class need not suffice for another.

Overall, your question raises interesting points about the relationship between deterministic and probabilistic algorithms, and the role of randomness in computation. I would recommend seeking out forums or discussion boards dedicated to theoretical computer science for more in-depth discussions on this topic.
 

1. What are probabilistic complexity classes?

Probabilistic complexity classes are a set of computational complexity classes that measure the resources (such as time and space) required to solve a problem with a probabilistic algorithm. These classes are used to classify problems based on their inherent difficulty and to analyze the efficiency of algorithms.
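
For example, primality testing has a classic fast probabilistic algorithm, the Miller-Rabin test. The sketch below is illustrative rather than production-grade:

```python
import random

def miller_rabin(n, rounds=20):
    """Probabilistic primality test: always answers True for a prime; for a
    composite n, each independent round wrongly reports 'probably prime' with
    probability at most 1/4, so the overall error is at most 4**-rounds."""
    if n < 2:
        return False
    if n in (2, 3):
        return True
    if n % 2 == 0:
        return False
    # Write n - 1 as d * 2**r with d odd.
    d, r = n - 1, 0
    while d % 2 == 0:
        d //= 2
        r += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False  # a is a witness: n is definitely composite
    return True  # probably prime

print(miller_rabin(2_147_483_647))  # True: 2**31 - 1 is prime
print(miller_rabin(561))            # False: 561 = 3 * 11 * 17 is composite
```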

2. What is an acceptable source of entropy?

An acceptable source of entropy is one that provides enough unpredictability for the purpose at hand; for cryptography, that means bits an adversary cannot predict. Such a source can be physical, such as atmospheric or electronic noise, or a cryptographically secure pseudorandom generator seeded from a physical source. A deterministic function such as a hash cannot create entropy on its own, though it is often used to condition or stretch entropy gathered elsewhere.
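
As an illustration, a biased but independent physical bit source can be turned into unbiased bits with the classic von Neumann extractor. The `biased_bit` function below is a hypothetical stand-in for a real physical source:

```python
import random

def biased_bit(p=0.8):
    """Hypothetical physical source: independent bits that are 1 with probability p."""
    return 1 if random.random() < p else 0

def von_neumann_extract(n, raw_bit=biased_bit):
    """Von Neumann debiasing: read raw bits in pairs, output 1 for the pair (1, 0)
    and 0 for (0, 1), and discard (0, 0) and (1, 1).  If the raw bits are
    independent, the output bits are exactly uniform, though many raw bits are wasted."""
    out = []
    while len(out) < n:
        a, b = raw_bit(), raw_bit()
        if a != b:
            out.append(a)
    return out

print(von_neumann_extract(16))  # 16 unbiased bits extracted from a biased source
```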

3. How are probabilistic complexity classes different from deterministic complexity classes?

Probabilistic complexity classes allow the use of randomness in solving problems, while deterministic complexity classes do not. A probabilistic algorithm may be faster or simpler than any known deterministic one, at the price of a small probability of returning an incorrect answer (or, for Las Vegas algorithms, a variable running time). A deterministic algorithm, on the other hand, always produces the same result for a given input.
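
The risk of an incorrect result can be made negligible by repetition: running a bounded-error procedure many times and taking the majority vote drives the error down exponentially. A minimal sketch, using a hypothetical `noisy_decide` procedure with a 1/3 error rate:

```python
import random

def noisy_decide(x):
    """Hypothetical bounded-error procedure: answers 'is x odd?' correctly with
    probability 2/3 and incorrectly with probability 1/3."""
    correct = (x % 2 == 1)
    return correct if random.random() < 2 / 3 else not correct

def amplified_decide(x, repetitions=51):
    """Majority vote over many independent runs; by a Chernoff bound the error
    probability falls exponentially in the number of repetitions."""
    votes = sum(1 if noisy_decide(x) else -1 for _ in range(repetitions))
    return votes > 0

print(amplified_decide(7))   # almost certainly True
print(amplified_decide(10))  # almost certainly False
```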

4. How are probabilistic complexity classes important in cryptography?

Probabilistic complexity classes play a crucial role in cryptography because adversaries are usually modeled as probabilistic polynomial-time algorithms. By understanding which class the underlying problem falls into, cryptographers can estimate how hard the system should be to break and design stronger, more secure schemes.

5. Can the complexity class of a problem change based on the source of entropy used?

Strictly speaking, complexity classes are defined assuming access to uniform, independent random bits, so the class of a problem does not itself change. What can change is whether a randomized algorithm's stated guarantees still hold: with a weak, biased, or predictable entropy source, an algorithm designed to err at most 1/3 of the time may err far more often, and a cryptographic scheme may become insecure. This is why the source of entropy used in cryptographic systems is selected and tested so carefully.
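
A toy experiment makes the point: the same randomized procedure keeps its designed error bound with fair, independent coin flips, but loses it when fed a heavily biased source (both coin sources below are hypothetical stand-ins):

```python
import random

def randomized_check(x, coin):
    """Toy randomized procedure: answers 'is x odd?' and errs exactly when both
    of its coin flips come up 1, i.e. with probability 1/4 under fair coins."""
    correct = (x % 2 == 1)
    return (not correct) if coin() == 1 and coin() == 1 else correct

def error_rate(coin, trials=10_000):
    """Fraction of runs on the even input 10 that return the wrong answer (True)."""
    return sum(randomized_check(10, coin) for _ in range(trials)) / trials

fair_coin = lambda: random.randrange(2)                  # good entropy source
biased_coin = lambda: 1 if random.random() < 0.9 else 0  # weak source: mostly 1s

print(error_rate(fair_coin))    # about 0.25, within the designed bound
print(error_rate(biased_coin))  # about 0.81, the guarantee no longer holds
```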
