Regarding the probability bound for coin flips

AI Thread Summary
The discussion centers on characterizing the distribution of heads when flipping a fair coin 10,000 times. The key point is the probability bound P[heads > n/2 + k√n] < e^(-k^2)/2, described as an approximation for the tail of the Bernoulli distribution for large samples. The original poster is puzzled by the lack of a rigorous proof in the textbook, since the result does not follow directly from the Bernoulli distribution and the Chebyshev inequality alone. Replies note that the mean of N independent tosses is N/2 and the variance is N/4, and suggest deriving the bound by approximating the tail with an integral and working out an inequality relating the Chebyshev bound to the exponential one.
f24u7
Suppose you flip a fair coin 10,000 times. How can you characterize the distribution of the number of heads?

The textbook says that P[heads > n/2 + k√n] < e^(-k^2)/2. Why is that, and what is the derivation? What theorem is this? We have only learned the Bernoulli distribution and Chebyshev's inequality so far, so it seems odd that the textbook would jump to such a conclusion without a rigorous proof.

Thanks in advance
 
I haven't worked on it, but it looks like an approximation for the tail of the Bernoulli distribution for a large sample.
 
Thanks for the reply. Could you give me a hint on how you would go about deriving this?
 
f24u7 said:
Thanks for the reply. Could you give me a hint on how you would go about deriving this?

The first step is to approximate the tail by an integral.
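Not part of the thread, but as a numerical sanity check on that suggestion: the sketch below (function names `exact_tail` and `textbook_bound` are my own) compares the exact binomial tail P[heads > n/2 + k√n] for n = 10,000 fair flips against the quoted bound e^(-k^2)/2. The tail is summed exactly with integer arithmetic, so no approximation enters the comparison.

```python
import math
from fractions import Fraction

def exact_tail(n, k):
    """Exact P[heads > n/2 + k*sqrt(n)] for n fair coin flips."""
    cutoff = math.floor(n / 2 + k * math.sqrt(n))
    c = math.comb(n, cutoff + 1)   # first term of the tail
    total = 0
    for i in range(cutoff + 1, n + 1):
        total += c
        c = c * (n - i) // (i + 1)  # comb(n, i+1) from comb(n, i)
    return float(Fraction(total, 2 ** n))

def textbook_bound(k):
    """The bound quoted from the textbook: e^(-k^2)/2."""
    return math.exp(-k * k) / 2

n = 10_000
for k in (0.5, 1.0, 2.0):
    print(f"k={k}: exact tail = {exact_tail(n, k):.3e}, "
          f"bound = {textbook_bound(k):.3e}")
```

For each k the exact tail comes out well below the bound, consistent with the textbook's claim; the gap grows as k increases, which is what you'd expect if the bound comes from an integral estimate of the tail.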
 
f24u7 said:
we had only learn Bernoulli distribution and Chebyshev so far

I haven't worked on the problem either. I agree that the result is not a simple consequence of the Bernoulli distribution and the Chebyshev inequality, but you might be able to prove it from them with some work.

The mean of N independent tosses of a fair coin (landing "0" or "1") is the sum of the means of the individual tosses, and the variance is the sum of the individual variances. So you have a mean of N/2 and a variance of N(1/2)(1 - 1/2) = N/4. To prove the result from the Chebyshev inequality, you'd need to work out an inequality relating 1/x^2 and e^(-x^2).
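To illustrate that last point numerically (my own sketch, not from the thread): with variance N/4, Chebyshev gives P[|heads - N/2| >= k√N] <= (N/4)/(k√N)^2 = 1/(4k^2), which decays only polynomially in k, while the quoted bound decays exponentially. So Chebyshev on its own is always the weaker statement.

```python
import math

def chebyshev_bound(k):
    """Chebyshev with mean N/2 and variance N/4:
    P[|heads - N/2| >= k*sqrt(N)] <= (N/4)/(k*sqrt(N))^2 = 1/(4k^2)."""
    return 1 / (4 * k * k)

def exponential_bound(k):
    """The textbook's claimed tail bound: e^(-k^2)/2."""
    return math.exp(-k * k) / 2

for k in (1, 2, 3, 5):
    print(f"k={k}: Chebyshev {chebyshev_bound(k):.2e} "
          f"vs exponential {exponential_bound(k):.2e}")
```

In fact 1/(4k^2) > e^(-k^2)/2 for every k > 0 (equivalent to k^2 e^(-k^2) < 1/2, and the maximum of x e^(-x) is 1/e), which is why the exponential bound cannot be a direct consequence of Chebyshev and needs the integral estimate of the tail instead.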
 