Applying Chernoff bound on normal distribution


Discussion Overview

The discussion revolves around applying the Chernoff bound to a normally distributed variable and finding a bound on the deviation of the sample mean from the true mean. Participants explore the mathematical implications of using the Chernoff bound in this context, particularly focusing on the expectation term involved.

Discussion Character

  • Mathematical reasoning
  • Technical explanation
  • Debate/contested

Main Points Raised

  • One participant seeks to calculate the expectation \( E[e^{s(m_n-\mu)^2}] \) for the sample mean \( m_n \) of iid normal variables.
  • Another participant questions whether \( s \) can be treated as a constant, suggesting it may be a random variable instead.
  • A different participant asserts that \( s \) can be considered a constant, referencing the applicability of the Markov inequality.
  • Concerns are raised about the legitimacy of moving \( s \) out of the expectation in the context of the inequality presented, questioning whether this affects the validity of the approach.

Areas of Agreement / Disagreement

Participants express differing views on the treatment of \( s \) in the calculations, leading to unresolved questions about the legitimacy of certain mathematical manipulations. There is no consensus on how to proceed with the expectation calculation or the implications of treating \( s \) as a constant versus a random variable.

Contextual Notes

Participants highlight potential limitations in the approach, particularly regarding the treatment of \( s \) and the handling of the expectation term. The discussion remains focused on the mathematical intricacies without resolving these issues.

phonic
Dear all,

I am trying to find a good bound on the deviation of a normally distributed variable from its mean.

The normally distributed variables X_t \sim N(\mu, \sigma^2), t = 1, 2, \ldots, n are iid. Applying the Chernoff bound to the mean of these n iid variables:

m_n = \frac{1}{n} \sum_{t=1}^n X_t

P(m_n - \mu \geq \epsilon ) = \frac{1}{2} P(e^{s(m_n - \mu)^2} \geq e^{s\epsilon^2} ) \leq \frac{1}{2}e^{-s \epsilon^2} E[e^{s(m_n-\mu)^2}]
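
To spell out the first equality (a step assumed rather than stated above): m_n - \mu \sim N(0, \sigma^2/n) is symmetric about zero, and for s > 0 the map x \mapsto e^{sx^2} is increasing in |x|, so

P(m_n - \mu \geq \epsilon) = \frac{1}{2} P(|m_n - \mu| \geq \epsilon) = \frac{1}{2} P\big((m_n - \mu)^2 \geq \epsilon^2\big) = \frac{1}{2} P\big(e^{s(m_n-\mu)^2} \geq e^{s\epsilon^2}\big),

after which Markov's inequality gives the stated bound.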

The question is how to calculate this expectation
E[e^{s(m_n-\mu)^2}]

Can anybody give some hints? Thanks a lot!

Since m_n \sim N(\mu, \frac{\sigma^2}{n}), we have E[(m_n-\mu)^2] = \frac{\sigma^2}{n}. But E[e^{s(m_n-\mu)^2}] seems not so easy.
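
For reference, this expectation does have a closed form via the \chi^2_1 moment generating function (a standard fact, sketched here rather than taken from the thread). Writing Z = \sqrt{n}(m_n - \mu)/\sigma \sim N(0,1), so that (m_n - \mu)^2 = \frac{\sigma^2}{n} Z^2 with Z^2 \sim \chi^2_1 and E[e^{tZ^2}] = (1-2t)^{-1/2} for t < 1/2,

E[e^{s(m_n-\mu)^2}] = E\big[e^{(s\sigma^2/n) Z^2}\big] = \Big(1 - \frac{2s\sigma^2}{n}\Big)^{-1/2}, \qquad 0 < s < \frac{n}{2\sigma^2}.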

Phonic
 
Are you treating s as a constant? Can you? Isn't it a r.v., e.g. s = s_n?
 
s can be considered a constant, since the Markov inequality holds for any fixed s > 0.

Are there any known bounds on the tail probability of a normal distribution?
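
One standard answer (the classical Gaussian tail bound, included here for reference rather than quoted from the thread): for Z \sim N(0, \tau^2) and \epsilon > 0, applying the Chernoff bound to the linear exponent e^{sZ} and optimizing over s gives

P(Z \geq \epsilon) \leq \inf_{s > 0} e^{-s\epsilon} E[e^{sZ}] = \inf_{s > 0} e^{-s\epsilon + s^2\tau^2/2} = e^{-\epsilon^2/(2\tau^2)}.

With \tau^2 = \sigma^2/n this yields P(m_n - \mu \geq \epsilon) \leq e^{-n\epsilon^2/(2\sigma^2)}.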
 
But in P(m_n - \mu \geq \epsilon ) = \frac{1}{2} P(e^{s(m_n - \mu)^2} \geq e^{s\epsilon^2} ) \leq \frac{1}{2}e^{-s \epsilon^2} E[e^{s(m_n-\mu)^2}], you have moved s out of the E[] in e^{-s \epsilon^2} but left it inside the E[] in E[e^{s(m_n-\mu)^2}]. Is that legit? More to the point, can you also take it out of the latter, and if you can, would that make the job easier?
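
As a numerical sanity check on the closed form given earlier (a sketch only; mu, sigma, n, and s below are arbitrary illustrative choices, not values from the thread):

```python
import numpy as np

# Monte Carlo check of E[exp(s*(m_n - mu)^2)] against the closed form
# (1 - 2*s*sigma^2/n)^(-1/2), which requires s < n/(2*sigma^2).
rng = np.random.default_rng(0)
mu, sigma, n = 0.0, 1.0, 10
s = 2.0                                   # satisfies s < n/(2*sigma^2) = 5
trials = 500_000

samples = rng.normal(mu, sigma, size=(trials, n))
m_n = samples.mean(axis=1)                # sample means, ~ N(mu, sigma^2/n)
mc_estimate = np.exp(s * (m_n - mu) ** 2).mean()
closed_form = (1.0 - 2.0 * s * sigma**2 / n) ** -0.5

print(f"Monte Carlo: {mc_estimate:.4f}  closed form: {closed_form:.4f}")
```

Both values should agree to about two decimal places (roughly 1.29 for these parameters).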
 
