Probability inequality for the sum of independent normal random variables

In summary, the thread asks whether there is a tail-probability inequality for the mean of i.i.d. normal random variables, analogous to the Bernstein inequality for bounded variables. One reply points out that the sample mean of i.i.d. normals is itself exactly normal, so the tail probability has an exact expression. The discussion then turns to bounds on the normal tail (such as Gordon's inequality), the difficulty that such a bound has no analytical inverse, and the observation that if the bound function is allowed to take σ^2 and n as arguments, the tail probability can be stated exactly.
  • #1
phonic
Dear all,

I wonder whether there exists a probability inequality for the sum of independent normal random variables ([itex]X_i[/itex] are i.i.d. normal random variables with mean [itex]\mu[/itex] and variance [itex]\sigma^2[/itex]):
[itex]
P\left(\frac{1}{n}\sum_{i=1}^n X_i - \mu > \epsilon\right)\leq
f(\epsilon, \sigma^2, n).
[/itex]

We know that the Bernstein inequality holds for the sum of bounded random variables (with [itex]|X_i - \mu| \leq c[/itex] almost surely):
[itex]
P\left(\frac{1}{n}\sum_{i=1}^n X_i -\mu > \epsilon\right)\leq
\exp\left(-\frac{n\epsilon^2}{2\sigma^2+ 2c\epsilon/3} \right).
[/itex]

I wonder whether there is some similar inequality for normal variables.
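As a concrete illustration of the quantity to be bounded, here is a quick Monte Carlo sketch; the values of μ, σ, n, and ε are just assumed for the example, not part of the question:

[code]
# Monte Carlo estimate of P( (1/n) * sum(X_i) - mu > eps ) for i.i.d. normal X_i.
# mu, sigma, n, eps below are assumed example values, not from the thread.
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, n, eps = 0.0, 1.0, 100, 0.2
trials = 200_000

samples = rng.normal(mu, sigma, size=(trials, n))   # each row: one realization of X_1..X_n
tail_prob = np.mean(samples.mean(axis=1) - mu > eps)
print(tail_prob)  # roughly P(Z > eps*sqrt(n)/sigma) = P(Z > 2) ≈ 0.0228
[/code]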

Thanks!

Phonic
 
  • #2
EnumaElish
There is an exact equality; it follows from Σ X/n ~ N(μ, σ^2/n).
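Since the sample mean is exactly [itex]N(\mu, \sigma^2/n)[/itex], the tail probability can be computed directly from the standard normal survival function. A minimal sketch (the example arguments are assumed):

[code]
# Exact tail probability of the sample mean of i.i.d. N(mu, sigma^2) variables:
# P( (1/n)*sum(X_i) - mu > eps ) = P( Z > eps*sqrt(n)/sigma ),  Z ~ N(0, 1).
from math import sqrt
from scipy.stats import norm

def exact_tail(eps, sigma2, n):
    """Exact f(eps, sigma^2, n): tail probability of the sample mean."""
    return norm.sf(eps * sqrt(n) / sqrt(sigma2))

print(exact_tail(0.2, 1.0, 100))  # = P(Z > 2) ≈ 0.02275
[/code]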
 
  • #3
Thanks for your reply. Then the problem is to bound the tail probability of this normal variable. One known inequality is (R. D. Gordon, The Annals of Mathematical Statistics, 1941, Vol. 12, pp. 364–366)
[itex]
P(Z \geq x) = \int_x^\infty \frac{1}{\sqrt{2\pi}}
e^{-\frac{1}{2}z^2}\, dz \leq \frac{1}{x}
\frac{1}{\sqrt{2\pi}}e^{-\frac{1}{2}x^2} \quad \mbox{for } x>0,
[/itex]
where [itex]Z[/itex] is a standard normal variable.

The problem with this inequality is that the function [itex]\frac{1}{x}
e^{-\frac{1}{2}x^2}[/itex] is not invertible (it has no analytical inverse). Do you know some other bound for the tail probability of a normal variable? Thanks a lot!
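One practical workaround is to invert Gordon's bound numerically with a root finder; alternatively, the cruder Chernoff-type bound [itex]P(Z \geq x) \leq e^{-x^2/2}[/itex] inverts in closed form. A sketch of both (the target level δ and the bracketing interval are assumed example values):

[code]
# Numerically inverting Gordon's bound g(x) = exp(-x^2/2) / (x*sqrt(2*pi)),
# i.e. finding x such that g(x) = delta, plus the closed-form inverse of the
# cruder Chernoff bound P(Z >= x) <= exp(-x^2/2).
# delta and the bracketing interval [1, 10] are assumed example values.
from math import sqrt, pi, exp, log
from scipy.optimize import brentq

def gordon_bound(x):
    return exp(-0.5 * x * x) / (x * sqrt(2 * pi))

delta = 1e-3
# gordon_bound is decreasing on x > 0, so bracket the root and solve g(x) = delta.
x_gordon = brentq(lambda x: gordon_bound(x) - delta, 1.0, 10.0)
x_chernoff = sqrt(2 * log(1 / delta))   # inverse of exp(-x^2/2) = delta

print(x_gordon, x_chernoff)
[/code]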

EnumaElish said:
There is an exact equality; it follows from Σ X/n ~ N(μ, σ^2/n).
 
  • #4
Haven't you changed the upper bound function? Can the new function not have σ^2 or n as arguments? If it can, then you have an exact statement of the tail probability.
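Indeed, once σ^2 and n are allowed as arguments, the "bound" can simply be the exact tail probability. A sketch comparing the exact value with Gordon's bound written in the same arguments (the example values of ε, σ^2, and n are assumed):

[code]
# f_exact(eps, sigma^2, n) is the exact tail probability of the sample mean;
# f_gordon is Gordon's bound evaluated at x = eps*sqrt(n)/sigma.
# The example values of eps, sigma^2, and n are assumed, not from the thread.
from math import sqrt, pi, exp
from scipy.stats import norm

def f_exact(eps, sigma2, n):
    return norm.sf(eps * sqrt(n) / sqrt(sigma2))

def f_gordon(eps, sigma2, n):
    x = eps * sqrt(n) / sqrt(sigma2)
    return exp(-0.5 * x * x) / (x * sqrt(2 * pi))

for eps in (0.1, 0.2, 0.3):
    print(eps, f_exact(eps, 1.0, 100), f_gordon(eps, 1.0, 100))
[/code]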
 

1. What is the probability inequality for the sum of independent normal random variables?

For i.i.d. normal random variables no asymptotic argument is needed: their sum (and sample mean) is itself exactly normally distributed, so the tail probability can be written exactly or bounded with standard normal tail inequalities. The related asymptotic result for general distributions is the Central Limit Theorem, which states that as the number of independent random variables increases, the distribution of their suitably normalized sum tends toward a normal distribution.
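A quick numerical illustration of this convergence; the choice of a skewed (exponential) distribution and the sample sizes are assumed for the demo:

[code]
# Sample means of a skewed (exponential) distribution look increasingly normal
# as n grows -- a quick numerical illustration of the Central Limit Theorem.
# The choice of distribution and the sample sizes are assumed for the demo.
import numpy as np

rng = np.random.default_rng(1)
for n in (1, 5, 30, 200):
    means = rng.exponential(scale=1.0, size=(50_000, n)).mean(axis=1)
    # Standardize: exponential(1) has mean 1 and variance 1.
    z = (means - 1.0) * np.sqrt(n)
    print(n, z.mean().round(3), z.std().round(3), np.mean(z > 1.645).round(4))
    # For large n the last column approaches P(Z > 1.645) ≈ 0.05.
[/code]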

2. How does the Central Limit Theorem apply to real-world scenarios?

The Central Limit Theorem is applicable in many real-world scenarios, such as in finance, where the sum of many small random fluctuations can result in a normally distributed outcome. It is also used in quality control and statistical process control to determine if a process is operating within acceptable limits.

3. Are there any assumptions that need to be met for the Central Limit Theorem to apply?

Yes, the Central Limit Theorem in its classical form assumes that the random variables are independent and identically distributed, with a finite mean and a finite variance.

4. How can the probability inequality for the sum of independent normal random variables be used in decision making?

The probability inequality for the sum of independent normal random variables can be used to calculate the probability of certain outcomes and make decisions based on that probability. For example, in business, it can be used to determine the likelihood of achieving a certain level of profit or loss.

5. Are there any limitations to the Central Limit Theorem?

While the Central Limit Theorem is a powerful tool in probability and statistics, it does have some limitations. Convergence to the normal distribution can be slow for heavily skewed or heavy-tailed distributions, so the normal approximation may be poor for small sample sizes. The classical theorem also requires a finite variance, and if these assumptions are not met, the results may not be accurate.
