Lower Bound on Binomial Distribution

In summary, the conversation discusses lower bounds on the upper tail of a binomial distribution as N approaches infinity, where epsilon is less than or equal to 10^-3. The participants propose different upper and lower bounds and discuss using the law of large numbers to obtain a tight lower bound. They also note that epsilon must be less than 0.5 for the law-of-large-numbers argument to hold.
  • #1
EngWiPy
Hello all,

Is there any lower bound on the following Binomial distribution

[tex]\sum_{k=\lfloor N/2\rfloor+1}^N{N\choose k}\epsilon^k(1-\epsilon)^{N-k}[/tex]

as N goes to infinity, where epsilon is less than or equal to 10^-3?

Thanks
 
  • #2
It looks like the expression -> 0, for [itex]\epsilon \lt \frac{1}{2}[/itex].
 
  • #3
mathman said:
It looks like the expression -> 0, for [itex]\epsilon \lt \frac{1}{2}[/itex].

What if N is finite?
 
  • #4
S_David said:
Hello all,

Is there any lower bound on the following Binomial distribution

[tex]\sum_{k=\lfloor N/2\rfloor+1}^N{N\choose k}\epsilon^k(1-\epsilon)^{N-k}[/tex]

as N goes to infinity, where epsilon is less than or equal to 10^-3?

Thanks

I believe your expression tends to 0 as ##N \rightarrow \infty##, but I may be mistaken:

Since ##0<\epsilon \le 0.001##, you get that ##0 < \epsilon^k \le \epsilon^{\lfloor N/2\rfloor+1} \le \epsilon^{N/2}## and ##0 < (1-\epsilon)^{N-k} \le 1##

So that, using ##\sum_{k=0}^N {N\choose k} = 2^N##, ##0\le \sum_{k=\lfloor N/2\rfloor+1}^N{N\choose k}\epsilon^k(1-\epsilon)^{N-k} \le 2^N\epsilon^{N/2} = (2\sqrt{\epsilon})^N \rightarrow 0##
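
A quick numerical check of this bound (a minimal sketch, assuming the illustrative value ##\epsilon = 10^{-3}##):

[code=python]
from math import comb, floor, sqrt

def tail(N, eps):
    """Exact tail sum P(X > floor(N/2)) for X ~ Binomial(N, eps)."""
    return sum(comb(N, k) * eps**k * (1 - eps)**(N - k)
               for k in range(floor(N / 2) + 1, N + 1))

eps = 1e-3  # illustrative value, assumed
for N in (10, 20, 40):
    print(N, tail(N, eps), (2 * sqrt(eps))**N)  # exact tail vs. upper bound
[/code]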
 
  • #5
geoffrey159 said:
I believe your expression tends to 0 as ##N \rightarrow \infty##, but I may be mistaken:

Since ##0<\epsilon \le 0.001##, you get that ##0 < \epsilon^k \le \epsilon^{\lfloor N/2\rfloor+1} \le \epsilon^{N/2}## and ##0 < (1-\epsilon)^{N-k} \le 1##

So that, using ##\sum_{k=0}^N {N\choose k} = 2^N##, ##0\le \sum_{k=\lfloor N/2\rfloor+1}^N{N\choose k}\epsilon^k(1-\epsilon)^{N-k} \le 2^N\epsilon^{N/2} = (2\sqrt{\epsilon})^N \rightarrow 0##

The upper bound expression you reached is similar to something I have read. It was exactly:

[tex]\left[2\sqrt{\epsilon(1-\epsilon)}\,\right]^N\frac{1-\epsilon}{\epsilon}[/tex]

However, the latter is not a tight upper bound. Would yours be tighter? In my case, ##\epsilon## is a Gaussian Q-function, so what I did was keep only ##\epsilon^L##, where ##L=\lfloor N/2\rfloor+1##, and then apply the Chernoff bound on the Q-function, ##Q(x)\le e^{-x^2/2}##. But as you can see, I first used a lower bound and then an upper bound, which isn't valid. Still, simulations show that this bound is tighter than the upper bound I quoted above. I need to check whether yours is tighter still.

I need a lower bound for the Gaussian Q-function then, or a new lower bound for the binomial tail. Is there a simple one that can be solved for ##L## or ##N## in closed form, i.e., if the lower bound is ##LP=f(x,L)##, can I find ##L=f^{-1}(LP,x)##?

Thanks
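
A small script comparing the two upper bounds against the exact tail may help decide which is tighter (a sketch, with ##\epsilon = 10^{-3}## again an assumed illustrative value):

[code=python]
from math import comb, floor, sqrt

def tail(N, eps):
    """Exact tail sum P(X > floor(N/2)) for X ~ Binomial(N, eps)."""
    return sum(comb(N, k) * eps**k * (1 - eps)**(N - k)
               for k in range(floor(N / 2) + 1, N + 1))

eps = 1e-3  # illustrative value, assumed
for N in (10, 20, 40):
    quoted = (2 * sqrt(eps * (1 - eps)))**N * (1 - eps) / eps  # bound quoted in post #5
    simple = (2 * sqrt(eps))**N                                # bound from post #4
    print(N, tail(N, eps), quoted, simple)
[/code]

For these values the ##(2\sqrt{\epsilon})^N## bound comes out tighter, because the prefactor ##(1-\epsilon)/\epsilon## in the quoted bound is large when ##\epsilon## is small.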
 
  • #6
I am not competent to answer about the Chernoff bound; I don't know much about probability.
There is an easy lower bound, namely ##\epsilon^N## (the ##k=N## term), since you are summing positive terms.
 
  • #7
geoffrey159 said:
I am not competent to answer about the Chernoff bound; I don't know much about probability.
There is an easy lower bound, namely ##\epsilon^N## (the ##k=N## term), since you are summing positive terms.

I need a tight lower bound. ##\epsilon^L##, with ##L=\lfloor N/2\rfloor+1##, is a better lower bound than ##\epsilon^N##, since ##\epsilon## is small and the terms with the lowest exponents dominate the sum.
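
To quantify how much tighter ##\epsilon^L## is, here is a numerical comparison of both candidate lower bounds against the exact tail (a sketch, with ##\epsilon = 10^{-3}## assumed; note that ##\epsilon^L## lower-bounds the ##k=L## term only when ##{N\choose L}(1-\epsilon)^{N-L}\ge 1##, which holds for these values):

[code=python]
from math import comb, floor

def tail(N, eps):
    """Exact tail sum P(X > floor(N/2)) for X ~ Binomial(N, eps)."""
    return sum(comb(N, k) * eps**k * (1 - eps)**(N - k)
               for k in range(floor(N / 2) + 1, N + 1))

eps = 1e-3  # illustrative value, assumed
for N in (10, 20, 40):
    L = floor(N / 2) + 1
    # eps**L <= C(N, L) * eps**L * (1 - eps)**(N - L) here, so it bounds the tail
    print(N, eps**N, eps**L, tail(N, eps))
[/code]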
 
  • #8
I am relying on the law of large numbers: if the mean is < 1/2, then the probability that the average of N samples exceeds 1/2 goes to 0 as N becomes infinite.
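
This is easy to see empirically with a Monte Carlo sketch (##\epsilon = 0.3## is chosen here only so the decay is visible at small ##N##; with ##\epsilon = 10^{-3}## the estimates are already numerically zero):

[code=python]
import random

def estimate_tail(N, eps, trials=100_000):
    """Monte Carlo estimate of P(average of N Bernoulli(eps) samples > 1/2)."""
    hits = 0
    for _ in range(trials):
        successes = sum(random.random() < eps for _ in range(N))
        if successes > N / 2:
            hits += 1
    return hits / trials

for N in (5, 20, 80):
    print(N, estimate_tail(N, 0.3))  # shrinks toward 0 as N grows, per the LLN
[/code]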
 
  • #9
mathman said:
I am relying on the law of large numbers: if the mean is < 1/2, then the probability that the average of N samples exceeds 1/2 goes to 0 as N becomes infinite.

I think, since ##\epsilon## is small, we can keep just the first term ##{N\choose L}\epsilon^L(1-\epsilon)^{N-L}##; the underlying binomial variable is approximately Gaussian with mean ##N\epsilon## and variance ##N\epsilon(1-\epsilon)## as ##N## goes to ##\infty##. But this approximation is not helpful for finding ##N##.
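
A quick check of this approximation at ##k = L## (a sketch, with ##\epsilon = 10^{-3}## assumed). One caveat: the de Moivre–Laplace approximation is accurate near the mean ##N\epsilon##, so it can be far off this deep in the tail, as the numbers below illustrate:

[code=python]
from math import comb, exp, floor, pi, sqrt

def term(N, k, eps):
    """Exact binomial pmf at k."""
    return comb(N, k) * eps**k * (1 - eps)**(N - k)

def gauss_pmf(N, k, eps):
    """de Moivre-Laplace (Gaussian) approximation to the pmf at k."""
    mu, var = N * eps, N * eps * (1 - eps)
    return exp(-(k - mu)**2 / (2 * var)) / sqrt(2 * pi * var)

eps = 1e-3  # illustrative value, assumed
for N in (20, 100):
    L = floor(N / 2) + 1
    print(N, term(N, L, eps), gauss_pmf(N, L, eps))
[/code]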
 
  • #10
S_David said:
I think, since ##\epsilon## is small, we can keep just the first term ##{N\choose L}\epsilon^L(1-\epsilon)^{N-L}##; the underlying binomial variable is approximately Gaussian with mean ##N\epsilon## and variance ##N\epsilon(1-\epsilon)## as ##N## goes to ##\infty##. But this approximation is not helpful for finding ##N##.
What N are you looking for? The question is: what happens as [itex] N\to \infty[/itex]?
 
  • #11
geoffrey159 said:
I believe your expression tends to 0 as ##N \rightarrow \infty##, but I may be mistaken:

Since ##0<\epsilon \le 0.001##, you get that ##0 < \epsilon^k \le \epsilon^{\lfloor N/2\rfloor+1} \le \epsilon^{N/2}## and ##0 < (1-\epsilon)^{N-k} \le 1##

So that, using ##\sum_{k=0}^N {N\choose k} = 2^N##, ##0\le \sum_{k=\lfloor N/2\rfloor+1}^N{N\choose k}\epsilon^k(1-\epsilon)^{N-k} \le 2^N\epsilon^{N/2} = (2\sqrt{\epsilon})^N \rightarrow 0##

Why ##(2\sqrt{\epsilon})^N \rightarrow 0 ## as ##N\rightarrow \infty##?
 
  • #12
S_David said:
Why ##(2\sqrt{\epsilon})^N \rightarrow 0 ## as ##N\rightarrow \infty##?
[itex]2\sqrt{\epsilon}<1[/itex]: since ##\epsilon\le 10^{-3}##, ##2\sqrt{\epsilon}\le 2\sqrt{10^{-3}}\approx 0.063##, and a base smaller than 1 raised to the power ##N## goes to 0.
 
  • #13
mathman said:
[itex]2\sqrt{\epsilon}<1[/itex]: since ##\epsilon\le 10^{-3}##, ##2\sqrt{\epsilon}\le 2\sqrt{10^{-3}}\approx 0.063##, and a base smaller than 1 raised to the power ##N## goes to 0.

You asked me why I need to find ##N##. My actual problem is that I have a constraint ##\sum_{k=\lfloor N/2\rfloor+1}^N{N\choose k}\epsilon^k(1-\epsilon)^{N-k}\leq \eta##. I have two variables here, ##\epsilon\leq 0.5## and ##N##, and ##\epsilon## is a function of ##N##. In particular, ##\epsilon=Q\left[\sqrt{\frac{2\,x/N}{y/N+1}}\right]##, where ##x## and ##y## are given and ##Q[\cdot]## is the Gaussian ##Q##-function. I need to find ##N## such that the inequality holds. I said that ##\epsilon\leq 10^{-3}##, but actually I don't have control over ##\epsilon##; all I know is that ##\epsilon\leq 0.5##. I think the asymptotic argument doesn't always hold in this case, does it?
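
Since ##\epsilon## increases toward 0.5 as ##N## grows in this setup, the tail is not obviously monotone in ##N##, so a direct numerical scan may be the simplest way to find a feasible ##N##. A sketch (the values of ##x##, ##y##, and ##\eta## below are placeholders, not from the thread):

[code=python]
from math import comb, erfc, floor, sqrt

def Q(t):
    """Gaussian Q-function: Q(t) = P(Z > t) for Z ~ N(0, 1)."""
    return 0.5 * erfc(t / sqrt(2))

def tail(N, eps):
    """Exact tail sum P(X > floor(N/2)) for X ~ Binomial(N, eps)."""
    return sum(comb(N, k) * eps**k * (1 - eps)**(N - k)
               for k in range(floor(N / 2) + 1, N + 1))

def smallest_N(x, y, eta, N_max=500):
    """Scan for the smallest N with tail(N, eps(N)) <= eta; None if none found."""
    for N in range(1, N_max + 1):
        eps = Q(sqrt((2 * x / N) / (y / N + 1)))
        if tail(N, eps) <= eta:
            return N
    return None

print(smallest_N(x=30.0, y=5.0, eta=1e-6))  # placeholder inputs
[/code]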
 
  • #14
S_David said:
You asked me why I need to find ##N##. My actual problem is that I have a constraint ##\sum_{k=\lfloor N/2\rfloor+1}^N{N\choose k}\epsilon^k(1-\epsilon)^{N-k}\leq \eta##. I have two variables here, ##\epsilon\leq 0.5## and ##N##, and ##\epsilon## is a function of ##N##. In particular, ##\epsilon=Q\left[\sqrt{\frac{2\,x/N}{y/N+1}}\right]##, where ##x## and ##y## are given and ##Q[\cdot]## is the Gaussian ##Q##-function. I need to find ##N## such that the inequality holds. I said that ##\epsilon\leq 10^{-3}##, but actually I don't have control over ##\epsilon##; all I know is that ##\epsilon\leq 0.5##. I think the asymptotic argument doesn't always hold in this case, does it?
You need [itex]\epsilon \lt 0.5[/itex]; the case ##\epsilon = 0.5## is ambiguous, since the law-of-large-numbers argument needs an open interval around ##\epsilon## that stays below 0.5.
 
  • #15
mathman said:
You need [itex]\epsilon \lt 0.5[/itex]; the case ##\epsilon = 0.5## is ambiguous, since the law-of-large-numbers argument needs an open interval around ##\epsilon## that stays below 0.5.

I think we can safely assume it is less than 0.5.
 

1. What is a lower bound on a binomial distribution?

In the context of this thread, a lower bound refers to a quantity that is guaranteed to be no larger than a binomial probability of interest, here the upper-tail probability ##P(X > \lfloor N/2\rfloor)## for ##X \sim \text{Binomial}(N,\epsilon)##.

2. How is the lower bound on a binomial distribution calculated?

Since every term of the tail sum is nonnegative, keeping any subset of the terms gives a valid lower bound; for example, the single term ##k=N## gives the bound ##\epsilon^N## discussed above. (The quantity ##N\epsilon## is the mean of the distribution, not a lower bound.)

3. Why is the lower bound on a binomial distribution important?

A lower bound gives the minimum probability of seeing more than ##\lfloor N/2\rfloor## successes in ##N## trials. Together with an upper bound, it brackets the exact tail probability, which is useful when the sum itself is hard to evaluate or to invert for ##N##.

4. Can the lower bound on a binomial distribution be negative?

No. The tail sum is a probability built from nonnegative terms, so any sensible lower bound is nonnegative as well.

5. How does the lower bound on a binomial distribution relate to the central limit theorem?

The central limit theorem states that as the number of trials increases, the binomial distribution approaches a normal distribution. This underlies the Gaussian approximation mentioned in the thread; however, the normal approximation is accurate near the mean ##N\epsilon## and can be poor deep in the tail, which is why explicit bounds remain useful even for large ##N##.
