Applying the Chernoff bound to a normal distribution

SUMMARY

This discussion focuses on applying the Chernoff bound to a normally distributed variable, specifically addressing the deviation of the sample mean from the population mean. The variables in question are independent and identically distributed (iid) as X_t ~ N(μ, σ²). The user seeks to compute the expectation E[e^(s(m_n - μ)²)], where m_n is the sample mean. The conversation highlights the challenge of handling the parameter s, whether treated as a constant or a random variable, and the implications for applying the Markov inequality.

PREREQUISITES
  • Understanding of normal distribution, specifically N(μ, σ²)
  • Familiarity with Chebyshev's inequality and its applications
  • Knowledge of the Chernoff bound and its derivation
  • Basic concepts of expectation and variance in probability theory
NEXT STEPS
  • Research the derivation of the Chernoff bound for normal distributions
  • Learn about the properties of expectations, particularly E[e^(X)] for normally distributed variables
  • Explore advanced applications of the Markov inequality in probability theory
  • Investigate tail bounds for normal distributions and their implications
USEFUL FOR

This discussion is beneficial for statisticians, data scientists, and mathematicians interested in probability theory, particularly those working with normal distributions and bounds on deviations from the mean.

phonic
Dear all,

I am trying to find a good bound on the deviation of a normally distributed variable from its mean.

The normally distributed variables X_t \sim N(\mu, \sigma^2), t = 1, 2, \ldots, n, are iid. Applying a Chernoff-type bound (Markov's inequality applied to an exponential) to the mean of these n iid variables:

m_n = \frac{1}{n} \sum_{t=1}^n X_t

By the symmetry of the normal distribution, and by Markov's inequality for any s > 0:

P(m_n - \mu \geq \epsilon ) = \frac{1}{2} P((m_n - \mu)^2 \geq \epsilon^2) = \frac{1}{2} P(e^{s(m_n - \mu)^2} \geq e^{s\epsilon^2} ) \leq \frac{1}{2}e^{-s \epsilon^2} E[e^{s(m_n-\mu)^2}]

The question is how to calculate this expectation:
E[e^{s(m_n-\mu)^2}]

Can anybody give some hints? Thanks a lot!

Since m_n \sim N(\mu, \frac{\sigma^2}{n}),
E[(m_n-\mu)^2] = \frac{\sigma^2}{n}. But
E[e^{s(m_n-\mu)^2}] does not seem easy.

Phonic
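For reference, this expectation has a standard closed form; the following is a sketch, not a full derivation. Writing Z = m_n - \mu \sim N(0, \tau^2) with \tau^2 = \frac{\sigma^2}{n} and combining the exponents inside the Gaussian integral:

E[e^{sZ^2}] = \frac{1}{\sqrt{2\pi\tau^2}} \int_{-\infty}^{\infty} e^{-z^2 \left( \frac{1}{2\tau^2} - s \right)} dz = \frac{1}{\sqrt{1 - 2s\tau^2}}, \qquad s < \frac{1}{2\tau^2}

so E[e^{s(m_n-\mu)^2}] = \left(1 - \frac{2s\sigma^2}{n}\right)^{-1/2} for s < \frac{n}{2\sigma^2}, and the expectation is infinite otherwise.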
 
Are you treating s as a constant? Can you? Isn't it a random variable, e.g. s = s_n?
 
s can be treated as a constant, since the Markov inequality holds for any fixed s > 0; one then optimizes the bound over s at the end.

Are there some bounds on the tail probability of a normal distribution?
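For reference, two standard tail bounds for Z \sim N(0, 1) and x > 0 are the Mills-ratio bound P(Z \geq x) \leq \frac{1}{x\sqrt{2\pi}} e^{-x^2/2} and the Chernoff bound P(Z \geq x) \leq e^{-x^2/2}; the latter follows from applying Markov's inequality to e^{sZ}, using E[e^{sZ}] = e^{s^2/2}, and optimizing at s = x. Applied to m_n, the latter gives P(m_n - \mu \geq \epsilon) \leq e^{-n\epsilon^2/(2\sigma^2)}.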
 
But in P(m_n - \mu \geq \epsilon ) = \frac{1}{2} P(e^{s(m_n - \mu)^2} \geq e^{s\epsilon^2} ) \leq \frac{1}{2}e^{-s \epsilon^2} E[e^{s(m_n-\mu)^2}], you have moved s out of the E[\cdot] in e^{-s \epsilon^2} but left it inside the E[\cdot] in E[e^{s(m_n-\mu)^2}]. Is that legitimate? More to the point, can you also take it out of the latter, and if you can, would that make the job easier?
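To compare the two approaches concretely, here is a minimal numerical sketch; the values \mu = 0, \sigma = 1, n = 10 are illustrative assumptions, not from the thread, and scipy is used only for the exact tail probability. It evaluates the quadratic-exponent bound above, minimized over s, against the linear-exponent Chernoff bound e^{-n\epsilon^2/(2\sigma^2)}.

# Minimal numerical sketch comparing the exact tail probability of m_n
# with the two Chernoff-style bounds discussed above. The values
# mu = 0, sigma = 1, n = 10 are illustrative assumptions.
import numpy as np
from scipy.stats import norm

mu, sigma, n = 0.0, 1.0, 10
tau2 = sigma**2 / n  # variance of the sample mean m_n

for eps in [0.3, 0.6, 1.0]:
    # Exact tail: m_n ~ N(mu, tau2), so P(m_n - mu >= eps) = sf(mu + eps)
    exact = norm.sf(mu + eps, loc=mu, scale=np.sqrt(tau2))

    # Quadratic-exponent bound: (1/2) e^{-s eps^2} (1 - 2 s tau2)^{-1/2},
    # minimized over 0 < s < 1/(2 tau2); the minimizer is
    # s* = 1/(2 tau2) - 1/(2 eps^2), valid when eps^2 > tau2.
    if eps**2 > tau2:
        s_star = 1.0 / (2 * tau2) - 1.0 / (2 * eps**2)
        quad = 0.5 * np.exp(-s_star * eps**2) / np.sqrt(1.0 - 2.0 * s_star * tau2)
    else:
        quad = 0.5  # fall back to the trivial bound P(m_n - mu >= eps) <= 1/2

    # Linear-exponent Chernoff bound: e^{-n eps^2 / (2 sigma^2)}
    lin = np.exp(-n * eps**2 / (2 * sigma**2))

    print(f"eps={eps:.1f}  exact={exact:.3e}  quadratic={quad:.3e}  linear={lin:.3e}")

In this sketch the optimized quadratic-exponent bound works out to \frac{\sqrt{e}}{2} \frac{\epsilon \sqrt{n}}{\sigma} e^{-n\epsilon^2/(2\sigma^2)}, so the linear-exponent version is tighter once \epsilon exceeds roughly 1.2 \sigma/\sqrt{n}, which is one reason the usual Chernoff argument exponentiates m_n - \mu rather than (m_n - \mu)^2.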
 
