Applying Chernoff bound on normal distribution

  • Thread starter phonic
Dear all,

I am trying to find a good bound on the deviation of a normally distributed variable from its mean.

The normally distributed variables [tex]X_t \sim N(\mu, \sigma^2), \; t = 1,2,\ldots,n [/tex] are i.i.d. Applying a Chernoff-type (exponential Markov) bound to the mean of these n i.i.d. variables:

[tex]m_n = \frac{1}{n} \sum_{t=1}^n X_t[/tex]

[tex]P(m_n - \mu \geq \epsilon ) = \frac{1}{2} P(e^{s(m_n - \mu)^2} \geq e^{s\epsilon^2} ) \leq \frac{1}{2}e^{-s \epsilon^2} E[e^{s(m_n-\mu)^2}] [/tex]

Here the first equality uses the symmetry of [itex]m_n - \mu[/itex] about zero, and the inequality is Markov's inequality (valid for any [itex]s > 0[/itex]).

The question is how to calculate this expectation:
[tex] E[e^{s(m_n-\mu)^2}] [/tex]

Can anybody give some hints? Thanks a lot!

Since [tex]m_n \sim N(\mu, \frac{\sigma^2}{n} ) [/tex],
[tex] E[(m_n-\mu)^2] = \frac{\sigma^2}{n} [/tex]. But
[tex] E[e^{s(m_n-\mu)^2}] [/tex] does not seem easy.
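One thought: perhaps the Gaussian integral can be evaluated directly (a sketch, assuming [itex]s < \frac{n}{2\sigma^2}[/itex] so that the integral converges). Since [itex]m_n - \mu \sim N(0, \sigma^2/n)[/itex],

[tex] E[e^{s(m_n-\mu)^2}] = \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi\sigma^2/n}} \, e^{s y^2} e^{-\frac{n y^2}{2\sigma^2}} \, dy = \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi\sigma^2/n}} \, e^{-\left(\frac{n}{2\sigma^2} - s\right) y^2} \, dy = \frac{1}{\sqrt{1 - \frac{2 s \sigma^2}{n}}}. [/tex]

Plugging this back in and optimizing over s should then give a bound that decays like [itex]e^{-n\epsilon^2/(2\sigma^2)}[/itex], up to a factor growing like [itex]\sqrt{n}[/itex].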

Phonic
 

EnumaElish

Are you treating s as a constant? Can you? Isn't it a random variable, e.g. [itex]s = s_n[/itex]?
 
s can be treated as a constant, since the Markov inequality holds for any [itex]s > 0[/itex].

Are there good bounds on the tail probability of a normal distribution?
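For reference, two standard bounds on the tail of [itex]Z \sim N(0,1)[/itex], for [itex]t > 0[/itex], are

[tex] P(Z \geq t) \leq \frac{1}{t\sqrt{2\pi}} \, e^{-t^2/2}, \qquad P(Z \geq t) \leq e^{-t^2/2}. [/tex]

The first follows from [itex]\int_t^{\infty} \varphi(y)\,dy \leq \int_t^{\infty} \frac{y}{t}\varphi(y)\,dy[/itex], where [itex]\varphi[/itex] is the standard normal density; the second from the Chernoff/Markov argument with a linear exponent.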
 

EnumaElish

But in [itex]P(m_n - \mu \geq \epsilon ) = \frac{1}{2} P(e^{s(m_n - \mu)^2} \geq e^{s\epsilon^2} ) \leq \frac{1}{2}e^{-s \epsilon^2} E[e^{s(m_n-\mu)^2}] [/itex], you have moved s outside the E[] in [itex]e^{-s \epsilon^2} [/itex] but left it inside the E[] in [itex]E[e^{s(m_n-\mu)^2}][/itex]. Is that legitimate? More to the point, can you also take it out of the latter, and if so, would that make the job easier?
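For instance, a sketch along those lines, using the linear exponent [itex]s(m_n - \mu)[/itex] instead of the squared one: since [itex]m_n - \mu \sim N(0, \sigma^2/n)[/itex], its moment generating function is [itex]E[e^{s(m_n-\mu)}] = e^{s^2\sigma^2/(2n)}[/itex], so for any [itex]s > 0[/itex]

[tex] P(m_n - \mu \geq \epsilon) \leq e^{-s\epsilon} E[e^{s(m_n-\mu)}] = e^{-s\epsilon + \frac{s^2 \sigma^2}{2n}}, [/tex]

and choosing [itex]s = n\epsilon/\sigma^2[/itex] minimizes the right-hand side, giving [itex]P(m_n - \mu \geq \epsilon) \leq e^{-n\epsilon^2/(2\sigma^2)}[/itex].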
 
