Probability inequality for the sum of independent normal random variables

phonic
Dear all,

I wonder whether there exists a probability inequality for the sum of independent normal random variables (the X_i are i.i.d. normal random variables with mean \mu and variance \sigma^2):
P\left(\frac{1}{n}\sum_{i=1}^n X_i - \mu > \epsilon\right) \leq f(\epsilon, \sigma^2, n).

We know that Bernstein's inequality applies to the sum of bounded random variables (with |X_i - \mu| \leq c):
P\left(\frac{1}{n}\sum_{i=1}^n X_i - \mu > \epsilon\right) \leq \exp\left(-\frac{n\epsilon^2}{2\sigma^2 + 2c\epsilon/3}\right).
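As a quick sanity check of the Bernstein bound for bounded variables, here is a Monte Carlo sketch. The specific distribution and parameter values (Uniform(-1, 1), so \mu = 0, \sigma^2 = 1/3, c = 1, and the choices of n and \epsilon) are my own illustrative assumptions, not from the thread:

```python
import numpy as np

# Monte Carlo check of Bernstein's inequality for bounded variables.
# Assumed setup: X_i ~ Uniform(-1, 1), so mu = 0, sigma^2 = 1/3,
# and |X_i - mu| <= c with c = 1.
rng = np.random.default_rng(0)
n, trials = 50, 200_000
mu, sigma2, c, eps = 0.0, 1.0 / 3.0, 1.0, 0.2

samples = rng.uniform(-1.0, 1.0, size=(trials, n))
empirical = np.mean(samples.mean(axis=1) - mu > eps)
bernstein = np.exp(-n * eps**2 / (2 * sigma2 + 2 * c * eps / 3))

print(empirical, bernstein)  # the empirical tail frequency stays below the bound
```

The bound is loose here (it trades tightness for generality), but it does hold.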

I wonder whether there is some similar inequality for normal variables.

Thanks!

Phonic
 
There is an exact equality; it follows from Σ X/n ~ N(μ, σ^2/n).
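The exact statement above can be verified numerically: the sample mean of n i.i.d. N(\mu, \sigma^2) variables is exactly N(\mu, \sigma^2/n), so the tail probability is a normal survival function. A sketch (the values of \mu, \sigma, n, and \epsilon are illustrative assumptions):

```python
import numpy as np
from scipy.stats import norm

# The sample mean of n i.i.d. N(mu, sigma^2) variables is exactly
# N(mu, sigma^2/n), so P(mean - mu > eps) has an exact expression.
mu, sigma, n, eps = 0.0, 2.0, 25, 0.5

exact = norm.sf(eps * np.sqrt(n) / sigma)  # P((1/n) sum X_i - mu > eps)

# Monte Carlo verification of the same probability.
rng = np.random.default_rng(1)
means = rng.normal(mu, sigma, size=(200_000, n)).mean(axis=1)
mc = np.mean(means - mu > eps)
print(exact, mc)  # the two estimates agree closely
```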
 
Thanks for your reply. Then the problem is to bound the tail probability of this normal variable. One inequality I know is (R. D. Gordon, The Annals of Mathematical Statistics, 1941, Vol. 12, pp. 364–366)
P(z \geq x) = \int_x^\infty \frac{1}{\sqrt{2\pi}} e^{-\frac{1}{2}z^2}\, dz \leq \frac{1}{x}\frac{1}{\sqrt{2\pi}} e^{-\frac{1}{2}x^2} \quad \text{for } x > 0,
where z is a standard normal variable.
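Gordon's bound (the Mills-ratio bound) is easy to compare against the exact standard-normal tail; the grid of x values below is an illustrative assumption:

```python
import numpy as np
from scipy.stats import norm

# Gordon's (Mills-ratio) bound vs. the exact standard-normal tail:
# P(z >= x) <= (1/x) * phi(x) for x > 0, where phi is the N(0,1) density.
xs = np.array([0.5, 1.0, 2.0, 3.0])
exact = norm.sf(xs)           # exact tail probability P(z >= x)
bound = norm.pdf(xs) / xs     # Gordon's upper bound

print(np.column_stack([xs, exact, bound]))
```

The bound holds for every x > 0 and becomes relatively tight as x grows.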

The problem with this inequality is that the function \frac{1}{x} e^{-\frac{1}{2}x^2} is not invertible (it has no analytical inverse). Do you know some other bound for the tail probability of a normal variable? Thanks a lot!
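Even without a closed-form inverse, the bound can be inverted numerically, since \frac{1}{x\sqrt{2\pi}} e^{-x^2/2} is strictly decreasing on (0, \infty). A sketch using a root finder (the bracketing interval is an assumption chosen wide enough for practical p):

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import norm

def gordon_bound(x):
    """Upper bound (1/x) * phi(x) on P(z >= x), valid for x > 0."""
    return norm.pdf(x) / x

def invert_gordon(p, lo=1e-6, hi=50.0):
    """Numerically solve gordon_bound(x) = p; no analytical inverse exists,
    but the bound is strictly decreasing for x > 0, so brentq applies."""
    return brentq(lambda x: gordon_bound(x) - p, lo, hi)

x = invert_gordon(0.01)
print(x, gordon_bound(x))  # gordon_bound(x) recovers 0.01 to high precision
```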

EnumaElish said:
There is an exact equality; it follows from Σ X/n ~ N(μ, σ^2/n).
 
Haven't you changed the upper bound function? Can the new function not have σ^2 or n as arguments? If it can, then you have an exact statement of the tail probability.
 