# Applying Chernoff bound on normal distribution

1. Dec 25, 2006

### phonic

Dear all,

I am trying to find a good bound on the deviation of a normally distributed variable from its mean.

The normally distributed variables $$X_t \sim N(\mu, \sigma^2), t= 1,2,...,n$$ are iid. Applying the Chernoff bounding technique (Markov's inequality applied to the exponential) to the mean of these n iid variables, for any $$s > 0$$:

$$m_n = \frac{1}{n} \sum_{t=1}^n X_t$$

$$P(m_n - \mu \geq \epsilon ) = \frac{1}{2} P(e^{s(m_n - \mu)^2} \geq e^{s\epsilon^2} ) \leq \frac{1}{2}e^{-s \epsilon^2} E[e^{s(m_n-\mu)^2}]$$

The question is how to calculate this expectation:
$$E[e^{s(m_n-\mu)^2}]$$

Can anybody give some hints? Thanks a lot!

Since $$m_n \sim N(\mu, \frac{\sigma^2}{n} )$$,
$$E[(m_n-\mu)^2] = \frac{\sigma^2}{n}$$. But
$$E[e^{s(m_n-\mu)^2}]$$ seems not easy.

Phonic
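The expectation does have a closed form: if $$Z \sim N(0, \tau^2)$$, then $$E[e^{sZ^2}] = (1 - 2s\tau^2)^{-1/2}$$ for $$s < 1/(2\tau^2)$$ (a standard Gaussian integral, completing the square in the exponent); here $$\tau^2 = \sigma^2/n$$. A numerical sketch checking this (the function names and the sample values of $$s$$ and $$\tau$$ are mine):

```python
import math

def mgf_of_square_numeric(s, tau, half_width=20.0, steps=200001):
    # Numerically integrate E[exp(s Z^2)] for Z ~ N(0, tau^2)
    # via the trapezoid rule on [-half_width, half_width].
    a, b = -half_width, half_width
    h = (b - a) / (steps - 1)
    norm = tau * math.sqrt(2.0 * math.pi)
    total = 0.0
    for i in range(steps):
        z = a + i * h
        # Combine the exponents to avoid overflow/underflow:
        # exp(s z^2) * exp(-z^2 / (2 tau^2)) = exp(z^2 (s - 1/(2 tau^2)))
        val = math.exp(s * z * z - z * z / (2.0 * tau * tau)) / norm
        w = 0.5 if i in (0, steps - 1) else 1.0
        total += w * val
    return total * h

def mgf_of_square_closed(s, tau):
    # Closed form, valid only for s < 1/(2 tau^2).
    assert 2.0 * s * tau * tau < 1.0
    return 1.0 / math.sqrt(1.0 - 2.0 * s * tau * tau)

tau = 0.5   # tau^2 plays the role of sigma^2 / n
s = 1.0     # must satisfy s < 1/(2 tau^2) = 2
print(mgf_of_square_numeric(s, tau))
print(mgf_of_square_closed(s, tau))
```

Note the integrand only decays when $$s < 1/(2\tau^2)$$; for larger $$s$$ the expectation is infinite, which is why the Chernoff optimization over $$s$$ is constrained.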

Last edited: Dec 25, 2006
2. Dec 28, 2006

### EnumaElish

Are you treating s as a constant? Can you? Isn't it an r.v., e.g. $$s = s_n$$?

3. Jan 8, 2007

### phonic

s can be treated as a constant, since the Markov inequality holds for any $$s > 0$$.

Are there bounds on the tail probability of a normal distribution?
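Two standard bounds on the upper tail of a standard normal $$Z$$, for $$t > 0$$: the Chernoff bound $$P(Z \geq t) \leq e^{-t^2/2}$$, and the sharper Mills-ratio bound $$P(Z \geq t) \leq \phi(t)/t$$ where $$\phi$$ is the standard normal density. A quick numerical check (function names are mine; the exact tail comes from the complementary error function):

```python
import math

def normal_upper_tail(t):
    # Exact P(Z >= t) for a standard normal, via erfc.
    return 0.5 * math.erfc(t / math.sqrt(2.0))

def chernoff_tail_bound(t):
    # Chernoff bound: P(Z >= t) <= exp(-t^2 / 2).
    return math.exp(-t * t / 2.0)

def mills_ratio_bound(t):
    # Mills-ratio bound, valid for t > 0: P(Z >= t) <= phi(t) / t.
    phi = math.exp(-t * t / 2.0) / math.sqrt(2.0 * math.pi)
    return phi / t

for t in (0.5, 1.0, 2.0, 3.0):
    exact = normal_upper_tail(t)
    print(t, exact, chernoff_tail_bound(t), mills_ratio_bound(t))
```

The Mills-ratio bound is tight for large $$t$$, while the Chernoff bound is off by roughly a factor $$t\sqrt{2\pi}$$ but is the one that generalizes to sums via the MGF.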

4. Jan 8, 2007

### EnumaElish

But in $P(m_n - \mu \geq \epsilon ) = \frac{1}{2} P(e^{s(m_n - \mu)^2} \geq e^{s\epsilon^2} ) \leq \frac{1}{2}e^{-s \epsilon^2} E[e^{s(m_n-\mu)^2}]$, you have moved s out of the E[] in $e^{-s \epsilon^2}$ but then left it inside the E[] in $E[e^{s(m_n-\mu)^2}]$, is that legit? More to the point, can you also take it out of the latter, and if you can, would that make the job easier?
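On moving $$s$$ around: $$e^{-s\epsilon^2}$$ is a deterministic constant, so factoring it out of the expectation is legitimate. But $$E[e^{s(m_n-\mu)^2}]$$ cannot be replaced by $$e^{s E[(m_n-\mu)^2]}$$: by Jensen's inequality (convexity of $$x \mapsto e^{sx}$$), $$E[e^{sZ^2}] \geq e^{s E[Z^2]}$$, so that swap would weaken the bound in the wrong direction. A quick check with $$Z \sim N(0, \tau^2)$$ (the sample values are mine):

```python
import math

# Z ~ N(0, tau^2): compare E[exp(s Z^2)] with exp(s E[Z^2]).
tau, s = 0.5, 1.0   # assumed values; the closed form needs s < 1/(2 tau^2)

# E[e^{sZ^2}] = (1 - 2 s tau^2)^{-1/2}, a standard Gaussian integral.
exact = 1.0 / math.sqrt(1.0 - 2.0 * s * tau * tau)

# e^{s E[Z^2]} with E[Z^2] = tau^2 -- the Jensen lower bound.
jensen_lower = math.exp(s * tau * tau)

print(exact, jensen_lower)  # Jensen: exact >= jensen_lower
```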

Last edited: Jan 8, 2007