Lower Bound on Binomial Distribution


Discussion Overview

The discussion centers on exploring the lower bounds of a specific binomial distribution as the number of trials, N, approaches infinity, particularly when the probability of success, epsilon, is constrained to be less than or equal to 10^-3. Participants examine various mathematical approaches and bounds related to this distribution.

Discussion Character

  • Exploratory, Technical explanation, Debate/contested, Mathematical reasoning

Main Points Raised

  • Some participants propose that the expression tends to 0 as N approaches infinity when epsilon is less than 1/2.
  • Others question the behavior of the expression when N is finite.
  • One participant suggests that a simple lower bound could be epsilon^N, while another argues that epsilon^L (where L = floor(N/2) + 1) is a better lower bound because epsilon is small.
  • There is mention of using the law of large numbers to argue that if the mean is less than 1/2, the probability of the average of N samples being greater than 1/2 approaches 0 as N increases.
  • Participants discuss the implications of approximating the binomial distribution as a Gaussian random variable and how this relates to finding N.
  • One participant expresses uncertainty about the applicability of the asymptotic behavior when epsilon is not strictly controlled.
  • Another participant emphasizes the need for epsilon to be strictly less than 0.5 for the law of large numbers to apply correctly.

Areas of Agreement / Disagreement

Participants do not reach a consensus on the lower bounds of the binomial distribution, with multiple competing views and approaches presented throughout the discussion.

Contextual Notes

Some participants highlight the dependence of their arguments on the specific values of epsilon and N, as well as the assumptions made regarding the behavior of the binomial distribution in the limit as N approaches infinity.

EngWiPy
Hello all,

Is there any lower bound on the following Binomial distribution

\sum_{k=floor(N/2)+1}^N{N\choose k}\epsilon^k(1-\epsilon)^{N-k}

as N goes to infinity, where epsilon is less than or equal to ##10^{-3}##?

Thanks
 
It looks like the expression ##\rightarrow 0## for ##\epsilon < \frac{1}{2}##.
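A quick numerical check of this claim, computing the tail sum exactly (a Python sketch, not from the thread):

```python
import math

def binom_tail(N, eps):
    """Exact value of sum_{k=floor(N/2)+1}^N C(N,k) eps^k (1-eps)^(N-k)."""
    return sum(math.comb(N, k) * eps**k * (1 - eps)**(N - k)
               for k in range(N // 2 + 1, N + 1))

# For eps well below 1/2 the tail shrinks rapidly with N.
for N in (10, 50, 100):
    print(N, binom_tail(N, 1e-3))
```

For ##\epsilon = 10^{-3}## the printed values fall off by many orders of magnitude as ##N## grows, consistent with the limit being 0.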
 
mathman said:
It looks like the expression ##\rightarrow 0## for ##\epsilon < \frac{1}{2}##.

What if N is finite?
 
S_David said:
Hello all,

Is there any lower bound on the following Binomial distribution

\sum_{k=floor(N/2)+1}^N{N\choose k}\epsilon^k(1-\epsilon)^{N-k}

as N goes to infinity, where epsilon is less than or equal to ##10^{-3}##?

Thanks

I believe your expression tends to 0 as ##N \rightarrow \infty##, but I may be mistaken:

Since ##0<\epsilon \le 0.001##, you get that ##0 < \epsilon^k \le \epsilon^{floor(N/2)+1} \le \epsilon ^{N/2}## and ## 0 < (1-\epsilon)^{N-k} \le 1 ##

So that ##0\le \sum_{k=floor(N/2)+1}^N{N\choose k}\epsilon^k(1-\epsilon)^{N-k} \le (2\sqrt{\epsilon})^N \rightarrow 0 ##
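The inequality chain above can be checked numerically. A small sketch of mine, using the exact tail sum, confirming ##\text{tail} \le (2\sqrt{\epsilon})^N##, which tends to 0 whenever ##2\sqrt{\epsilon} < 1##, i.e. ##\epsilon < 1/4##:

```python
import math

def binom_tail(N, eps):
    # Exact tail: sum_{k=floor(N/2)+1}^N C(N,k) eps^k (1-eps)^(N-k)
    return sum(math.comb(N, k) * eps**k * (1 - eps)**(N - k)
               for k in range(N // 2 + 1, N + 1))

eps = 1e-3
for N in (5, 20, 60):
    # The bound follows from eps^k <= eps^(N/2) and sum_k C(N,k) = 2^N.
    assert binom_tail(N, eps) <= (2 * math.sqrt(eps))**N
```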
 
geoffrey159 said:
I believe your expression tends to 0 as ##N \rightarrow \infty##, but I may be mistaken:

Since ##0<\epsilon \le 0.001##, you get that ##0 < \epsilon^k \le \epsilon^{floor(N/2)+1} \le \epsilon ^{N/2}## and ## 0 < (1-\epsilon)^{N-k} \le 1 ##

So that ##0\le \sum_{k=floor(N/2)+1}^N{N\choose k}\epsilon^k(1-\epsilon)^{N-k} \le (2\sqrt{\epsilon})^N \rightarrow 0 ##

The upper bound expression you reached is similar to something I have read. It was exactly:

\left[2\sqrt{\epsilon(1-\epsilon)}\,\right]^N\frac{1-\epsilon}{\epsilon}

However, the latter is not a tight upper bound. Would yours be tighter? In my case, ##\epsilon## is a Gaussian ##Q##-function, so what I did was keep only ##\epsilon^L##, where ##L=floor(N/2)+1##, and then apply the Chernoff bound on the ##Q##-function, ##Q(x)\le e^{-x^2/2}##. As you can see, that mixes a lower bound with an upper bound, which isn't consistent. Still, simulations show this bound is tighter than the upper bound I quoted above. I need to check whether yours is better still.

So I need either a lower bound on the Gaussian ##Q##-function, or a new lower bound on the binomial tail. Is there a simple one that lets me solve for ##L## or ##N## in closed form: if the lower bound is ##LP=f(x,L)##, can I invert it to get ##L=f^{-1}(LP,x)##?

Thanks
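As a side check on the Chernoff-type bound mentioned above, ##Q(x)\le e^{-x^2/2}## for ##x\ge 0##. A small sketch, with the ##Q##-function written via `erfc` (assuming the standard definition ##Q(x)=P(Z>x)## for standard normal ##Z##):

```python
import math

def Q(x):
    """Gaussian Q-function, Q(x) = P(Z > x) for Z ~ N(0, 1)."""
    return 0.5 * math.erfc(x / math.sqrt(2))

# Chernoff-type bound: Q(x) <= exp(-x^2 / 2) for x >= 0.
for x in (0.0, 0.5, 1.0, 2.0, 3.0):
    assert Q(x) <= math.exp(-x**2 / 2)
```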
 
I am not competent to comment on the Chernoff bound; I don't know much about probability.
There is an easy lower bound, ##\epsilon^N##: since all the terms are positive, the sum is at least its last term, ##{N\choose N}\epsilon^N(1-\epsilon)^0=\epsilon^N##.
 
geoffrey159 said:
I am not competent to answer you about Chernoff bound, I don't know much about probabilities.
There is an easy lower bound that is ##\epsilon^N## since you are summing positive terms.

I need a tight lower bound. ##\epsilon^L## is better than ##\epsilon^N## as a lower bound, since ##\epsilon## is small, and thus the terms with lower exponents dominate the result.
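Numerically, for small ##\epsilon## and moderate ##N##, one finds ##\epsilon^N \le \epsilon^L \le## the exact tail, so ##\epsilon^L## is indeed the tighter of the two lower bounds. A sketch of mine checking specific cases only (not a proof for all ##N##):

```python
import math

def binom_tail(N, eps):
    return sum(math.comb(N, k) * eps**k * (1 - eps)**(N - k)
               for k in range(N // 2 + 1, N + 1))

eps = 1e-3
for N in (6, 12, 20):
    L = N // 2 + 1
    # eps^L sits far above eps^N and below the exact tail in these cases.
    assert eps**N <= eps**L <= binom_tail(N, eps)
```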
 
I am relying on the law of large numbers: if the mean is ##< 1/2##, then the probability that the average of ##N## samples exceeds ##1/2## goes to 0 as ##N## becomes infinite.
 
mathman said:
I am relying on the law of large numbers. If the mean < 1/2, then the probability of the average of N samples > 1/2 goes to 0, as N becomes infinite.

I think, since ##\epsilon## is small, we can keep just the first term, ##{N\choose L}\epsilon^L(1-\epsilon)^{N-L}##. Also, the binomial random variable itself is approximately Gaussian with mean ##N\epsilon## and variance ##N\epsilon(1-\epsilon)## as ##N## goes to ##\infty##. But this approximation does not help me find ##N##.
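One caveat worth noting: for small ##\epsilon##, the point ##N/2## lies far out in the tail, where the normal approximation is very inaccurate. A sketch comparing the exact tail with the Gaussian approximation ##P(X > N/2) \approx Q\big((N/2 - N\epsilon)/\sqrt{N\epsilon(1-\epsilon)}\big)## (my construction, not from the thread):

```python
import math

def binom_tail(N, eps):
    return sum(math.comb(N, k) * eps**k * (1 - eps)**(N - k)
               for k in range(N // 2 + 1, N + 1))

def normal_tail(N, eps):
    """Normal approximation to P(X > N/2) for X ~ Binomial(N, eps)."""
    mu = N * eps
    sigma = math.sqrt(N * eps * (1 - eps))
    z = (N / 2 - mu) / sigma
    return 0.5 * math.erfc(z / math.sqrt(2))

# For eps = 1e-3 the z-score is huge: the normal tail underflows to 0,
# while the exact tail is tiny but nonzero.
eps = 1e-3
for N in (10, 50):
    print(N, binom_tail(N, eps), normal_tail(N, eps))
```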
 
S_David said:
I think, since ##\epsilon## is small, we can just take the first term ##{N\choose L}\epsilon^L(1-\epsilon)^{N-L}##, which is approximated as a Gaussian random variable with mean ##N\epsilon## and variance ##N\epsilon(1-\epsilon)## as ##N## goes to ##\infty##. But this approximation is not helpful for me to find ##N##.
What ##N## are you looking for? The question is: what happens as ##N\to \infty##?
 
geoffrey159 said:
I believe your expression tends to 0 as ##n \rightarrow \infty##, but I may be mistaking:

Since ##0<\epsilon \le 0.001##, you get that ##0 < \epsilon^k \le \epsilon^{floor(N/2)+1} \le \epsilon ^{N/2}## and ## 0 < (1-\epsilon)^{N-k} \le 1 ##

So that ##0\le \sum_{k=floor(N/2)+1}^N{N\choose k}\epsilon^k(1-\epsilon)^{N-k} \le (2\sqrt{\epsilon})^N \rightarrow 0 ##

Why ##(2\sqrt{\epsilon})^N \rightarrow 0 ## as ##N\rightarrow \infty##?
 
S_David said:
Why ##(2\sqrt{\epsilon})^N \rightarrow 0 ## as ##N\rightarrow \infty##?
##2\sqrt{\epsilon} < 1##, since ##\epsilon \le 10^{-3}## gives ##2\sqrt{\epsilon} \le 2\sqrt{10^{-3}} \approx 0.063##.
 
mathman said:
##2\sqrt{\epsilon} < 1##

You asked me why I need to find ##N##. My actual problem is that I have a constraint ##\sum_{k=floor(N/2)+1}^N{N\choose k}\epsilon^k(1-\epsilon)^{N-k}\leq \eta##. I have two variables here, ##\epsilon\leq 0.5## and ##N##, and ##\epsilon## is a function of ##N##. In particular, ##\epsilon=Q\left[\sqrt{\frac{2\,x/N}{y/N+1}}\right]##, where ##x## and ##y## are given and ##Q[\cdot]## is the Gaussian ##Q##-function. I need to find ##N## such that the inequality holds. I said that ##\epsilon\leq 10^{-3}##, but actually I don't have control over ##\epsilon##; all I know is that ##\epsilon\leq 0.5##. In that case the asymptotic result doesn't always hold, does it?
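Note that the argument simplifies: ##\frac{2\,x/N}{y/N+1} = \frac{2x}{y+N}##, so ##\epsilon(N)=Q\big[\sqrt{2x/(y+N)}\big]## increases toward ##Q(0)=1/2## as ##N\to\infty##, which is exactly why the asymptotic argument need not apply. For given ##x##, ##y##, ##\eta##, one can still search for a feasible ##N## directly. A sketch with made-up values ##x=100##, ##y=10##, ##\eta=10^{-6}## (illustrative only, not from the thread):

```python
import math

def Q(x):
    return 0.5 * math.erfc(x / math.sqrt(2))

def binom_tail(N, eps):
    return sum(math.comb(N, k) * eps**k * (1 - eps)**(N - k)
               for k in range(N // 2 + 1, N + 1))

def eps_of_N(N, x, y):
    # Q( sqrt( (2*x/N) / (y/N + 1) ) ) == Q( sqrt( 2*x / (y + N) ) )
    return Q(math.sqrt(2 * x / (y + N)))

def smallest_N(eta, x, y, N_max=500):
    """Linear scan for the smallest N meeting the tail constraint
    (returns None if no N <= N_max works)."""
    for N in range(1, N_max + 1):
        if binom_tail(N, eps_of_N(N, x, y)) <= eta:
            return N
    return None

print(smallest_N(1e-6, x=100.0, y=10.0))
```

Since ##\epsilon(N)## grows with ##N##, the tail constraint is not obviously monotone in ##N##, so a plain scan seems safer than bisection here.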
 
S_David said:
You asked me why I need to find N. My actual problem is that I have a limit on ##\sum_{k=floor(N/2)+1}^N{N\choose k}\epsilon^k(1-\epsilon)^{N-k}\leq \eta##. I have two variables here ##\epsilon\leq 0.5## and ##N##, and ##\epsilon## is a function of ##N##. In particular, ##\epsilon=Q\left[\sqrt{\frac{2\,x/N}{y/N+1}}\right]##, where ##x## and ##y## are given, and ##Q[.]## is the ##Q## Gaussian function. I need to find ##N## such that the inequality holds. I said that ##\epsilon\leq 10^{-3}##, but actually, I don't have control over ##\epsilon##. All I know is that ##\epsilon\leq 0.5##. I think in this case, the asymptote doesn't hold always, does it?
You need ##\epsilon < 0.5##. If ##\epsilon = 0.5## it is ambiguous, since the law of large numbers requires an open interval around ##\epsilon## that stays below ##0.5##.
 
mathman said:
You need ##\epsilon < 0.5##. If ##\epsilon = 0.5## it is ambiguous, since the law of large numbers requires an open interval around ##\epsilon## that stays below ##0.5##.

I think we can safely assume it is less than 0.5.
 
