Coin tossing -extended version-

  • Thread starter Stellaferox
In summary: the expected number of runs of K heads in N tosses is [1 + (N-K)/2]/2^K, which approaches 1 when N = 2^(K+1)-2. Treating the number of runs as approximately Poisson with that mean, the probability of at least one run of K heads in N = 2^(K+1)-2 tosses therefore tends to 1 - e^(-1) ≈ 0.632 as K grows.
  • #1
Stellaferox

We learned in previous posts that to obtain a string of at least K heads, one has to flip a coin on average N = 2^(K+1)-2 times (via Markov chains); e.g. for K = 3 heads, N = 14.

On the other hand: if you flip a coin 14 times and calculate the probability that a string of at least 3 heads occurs, you get 0.647949219.

The strange thing is that if you take string lengths K from 1 to (let's say) 10, compute the matching average number of flips N = 2^(K+1)-2, then reverse the exercise and calculate the probability that such a string occurs within those N flips, the result is always close to the probability of 0.647949219 mentioned above.
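For anyone who wants to check these numbers, here is a rough dynamic-programming sketch (Python, exact arithmetic with fractions; the names are just illustrative):

[code]
from fractions import Fraction

def prob_run(n, k, p=Fraction(1, 2)):
    """Exact probability of at least one run of k heads in n tosses."""
    # state[r] = P(current run of heads has length r and no run of k has occurred yet)
    state = [Fraction(0)] * k
    state[0] = Fraction(1)
    done = Fraction(0)                 # P(a run of k heads has already occurred)
    for _ in range(n):
        new = [Fraction(0)] * k
        for r, pr in enumerate(state):
            if pr == 0:
                continue
            if r + 1 == k:
                done += pr * p         # this heads completes the run
            else:
                new[r + 1] += pr * p   # this heads extends the run
            new[0] += pr * (1 - p)     # tails resets the run
        state = new
    return done

print(float(prob_run(14, 3)))          # 0.64794921875
[/code]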

Can someone please explain?

thnx in advance

Marc
 
  • #2


These are not the same problem.
The first problem is finding the average number of flips needed to get a string of k heads. The second is the probability of a string of length k occurring within a given number of flips.

Let P(n,k) = prob(a string of k heads is first completed on flip n), i.e. prob(string of k after n flips and no string of k after n-1 flips)

The answer to the first is ∑(n=0,∞) nP(n,k)

The answer to the second is ∑(j=0,n)P(j,k)
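A rough numerical check of both sums (a Python sketch; here P(n,k) is the probability that the first string of k heads is completed on flip n, and the names are illustrative):

[code]
from fractions import Fraction

def first_run_dist(k, n_max, p=Fraction(1, 2)):
    """P(n,k) for n = 1..n_max: probability that a string of k heads
    is first completed on flip n (exact arithmetic)."""
    state = [Fraction(0)] * k          # state[r] = P(current head run has length r, no string of k yet)
    state[0] = Fraction(1)
    dist = {}
    for n in range(1, n_max + 1):
        new = [Fraction(0)] * k
        finish = Fraction(0)
        for r, pr in enumerate(state):
            if r + 1 == k:
                finish += pr * p       # heads completes the first string of k on this flip
            else:
                new[r + 1] += pr * p
            new[0] += pr * (1 - p)     # tails resets the run
        dist[n] = finish
        state = new
    return dist

k = 3
dist = first_run_dist(k, 400)                          # 400 flips: truncation error is negligible for k = 3
mean = sum(n * pn for n, pn in dist.items())           # truncated sum of n*P(n,k)  -> about 14 = 2^(k+1)-2
prob = sum(pn for n, pn in dist.items() if n <= 14)    # sum_{j<=14} P(j,k)         -> about 0.647949
print(float(mean), float(prob))
[/code]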
 
  • #3


Hi Mathman,

Thank you for your answer and for the compact notation of what it all boils down to.

Yes, I know of course. What I wanted to know is how these compared probabilities all give a result close to 0.64. I find it hard to believe that this is coin-cidental... (forgive the pun)

Marc
 
  • #4


And, moreover, the probability seems to converge towards 0.6323 as K increases (with N = 2^(K+1)-2 as the average number of coin flips needed to obtain a string of K heads).
 
  • #5


mathman said:
These are not the same problem.
The first problem is finding the average number of flips needed to get a string of k heads. The second is the probability of a string of length k occurring within a given number of flips.


Stellaferox is talking about a pattern that emerges when you calculate the probability of getting a run of k heads in a series of 2^(k+1)-2 tosses. It appears to tend toward some limiting value.

2 6 0.671875
3 14 0.647949
4 30 0.639057
5 62 0.635327
6 126 0.633651
7 254 0.632865
8 510 0.632487

The first column is k, the second is the number of tosses n, and the third is the probability of getting a run of k heads in n tosses.
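These values can be reproduced approximately with a quick Monte Carlo sketch (Python; purely illustrative):

[code]
import random

def has_run(n, k):
    """Simulate n fair-coin tosses; return True if a run of k heads appears."""
    run = 0
    for _ in range(n):
        if random.random() < 0.5:     # heads
            run += 1
            if run == k:
                return True
        else:                         # tails resets the run
            run = 0
    return False

trials = 200_000
for k in range(2, 9):
    n = 2 ** (k + 1) - 2
    est = sum(has_run(n, k) for _ in range(trials)) / trials
    print(k, n, round(est, 4))
[/code]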

This is neat. I had not noticed this before, and I don't know why it happens.
 
  • #6


@ techmologist,

Yes, the question is really bugging me. How should we try to solve this? It is a bit hard to do by number crunching, since the larger powers of 2 exceed what Excel or ordinary fixed-precision arithmetic in programming languages can handle.
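One possible workaround (just a sketch, assuming Python): integers there are arbitrary precision, so the exact probability can be computed by counting the sequences that contain no run of K heads (a K-step Fibonacci recurrence), and the large powers of 2 are not a problem:

[code]
from fractions import Fraction

def prob_run_exact(n, k):
    """Exact P(at least one run of k heads in n fair tosses).

    a[m] = number of length-m toss sequences with no run of k heads,
    via a[m] = a[m-1] + a[m-2] + ... + a[m-k]."""
    a = [2 ** m for m in range(k)]     # for m < k, every sequence avoids a run of k
    for m in range(k, n + 1):
        a.append(sum(a[-k:]))
    return 1 - Fraction(a[n], 2 ** n)

for k in (3, 8, 12):
    n = 2 ** (k + 1) - 2
    print(k, n, float(prob_run_exact(n, k)))
[/code]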

Marc
 
  • #7


Stellaferox said:
@ techmologist,

Yes, the question is really bugging me. How should we try to solve this? It is a bit hard to do by number crunching, since the larger powers of 2 exceed what Excel or ordinary fixed-precision arithmetic in programming languages can handle.

Marc

Hi, Marc. I think I have guessed the answer (but not proved it). For large values of N and K, the number of runs of K heads in N tosses has a Poisson distribution with mean approximately N/2^(K+1). I got this approximate mean from considering the exact expected number of runs of K heads in N tosses, which is

[1 + (N-K)/2]/2^K

In the case we are considering, the mean number of runs of K heads approaches 1 as K grows large. Therefore, the probability that there is at least one run of K in N = 2^(K+1)-2 tosses is 1 minus the probability that there are none:

P{at least one run of K heads in 2^(K+1)-2 tosses} = 1 - P{0 runs of K heads in 2^(K+1)-2 tosses} ≈ 1 - e^(-1) = 0.6321205588
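A small sketch (Python; illustrative only) that plugs the exact mean into this Poisson approximation for N = 2^(K+1)-2 shows the values heading toward 1 - e^(-1):

[code]
from math import exp

def poisson_approx(n, k):
    """Poisson approximation: P(at least one run of k heads) ~ 1 - exp(-mu),
    with mu = [1 + (n-k)/2] / 2^k, the exact expected number of runs."""
    mu = (1 + (n - k) / 2) / 2 ** k
    return 1 - exp(-mu)

for k in range(2, 13):
    n = 2 ** (k + 1) - 2
    print(k, n, round(poisson_approx(n, k), 6))
[/code]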
 
  • #8


Hi Techmologist,

I think you are right! I tried to substitute 2^(K+1)-2 for N in the probability formula and take it to its limit, but didn't succeed. Your approach is much more elegant and, I might say, intuitive.
I didn't think of the Poisson distribution either. Rather stupid, considering I was playing with that function for the last month!
Thank you for thinking along.

Marc
 
  • #9


Stellaferox said:
Hi Techmologist,

I think you are right! I tried to substitute 2^(K+1)-2 for N in the probability formula and take it to its limit, but didn't succeed. Your approach is much more elegant and, I might say, intuitive.
I didn't think of the Poisson distribution either. Rather stupid, considering I was playing with that function for the last month!
Thank you for thinking along.

Marc

You're welcome. I'm glad you brought this up because otherwise I wouldn't have thought to apply the Poisson approximation to the problem of runs. I still need to check it to see for what range of numbers it gives reliable answers.

I've thought about it some more, and it is possible to see the tendency toward 1 - e^(-1) in the exact formula when you substitute in N = 2^(K+1)-2. Actually, since we are letting K grow large (at a much slower rate than N), the probability doesn't change much if you simply use N = 2^(K+1). Also, we can ignore those series of N tosses that begin with a run of K heads, since their contribution to the probability is negligible as K increases. That way, we can focus on only those series in which a tail is followed immediately by a run of K heads. So if P is the probability of a run of K heads in N = 2^(K+1) tosses, then

[tex]P \approx \sum_{j=1}^{\lfloor\frac{N}{K+1}\rfloor}(-1)^{j+1}\binom{N-jK}{j}\frac{1}{2^{j(K+1)}}[/tex]

substituting N for 2^(K+1) gives

[tex]P \approx \sum_{j=1}^{\lfloor\frac{N}{K+1}\rfloor}(-1)^{j+1}\binom{N-jK}{j}\frac{1}{N^j}[/tex]

Since N/(K+1) --> infinity and (N-jK)_j/N^j --> 1 as K grows large*, the above sum simplifies to

[tex]P \approx \sum_{j=1}^{\infty}(-1)^{j+1}\frac{1}{j!} = 1 - e^{-1}[/tex]

This reasoning can be extended to get the full Poisson distribution for the number of runs of K "successes" in a series of N Bernoulli trials, with success probability p and failure probability q = 1-p. If we let [tex]N = \mu/qp^K[/tex] as K grows large, so that N >> K >> 1, then for small j, the probability that there are exactly j runs of K successes in N trials tends to

[tex]e^{-\mu}\frac{\mu^j}{j!}[/tex]


*(n)_i is shorthand for n(n-1)...(n-i+1)
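For what it's worth, a small numerical sketch (Python; purely illustrative) of the alternating sum above, with N = 2^(K+1), shows it settling near 1 - e^(-1) as K grows:

[code]
from math import comb, exp

def run_prob_series(k):
    """Evaluate  sum_{j>=1} (-1)^(j+1) * C(N - j*k, j) / 2^(j*(k+1))  with N = 2^(k+1)."""
    n = 2 ** (k + 1)
    total = 0.0
    for j in range(1, n // (k + 1) + 1):
        total += (-1) ** (j + 1) * comb(n - j * k, j) / 2 ** (j * (k + 1))
    return total

for k in (3, 5, 8, 12):
    print(k, run_prob_series(k), 1 - exp(-1))
[/code]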
 

