Mean, variance of non-parametric estimator

  • Thread starter: rayge
  • Tags: Mean, Variance

SUMMARY
The discussion focuses on the mean and variance of the non-parametric estimator \(\hat{f}(x)=\frac{1}{2hn}\sum\limits_{i=1}^n I_i(x)\) for a probability density function (pdf). The mean is derived as \(E[\hat{f}(x)] = \frac{1}{2h}P(x-h<X<x+h) = \frac{1}{2h}\int_{x-h}^{x+h}f(t)\,dt\), from which the bias of the estimator follows.

PREREQUISITES

  • Understanding of non-parametric estimators
  • Familiarity with probability density functions (pdf)
  • Knowledge of expectation and bias in statistics
  • Basic calculus, particularly integration
NEXT STEPS
  • Study the properties of non-parametric estimators in statistical analysis
  • Learn about calculating bias and variance for estimators
  • Explore integral calculus as it applies to probability density functions
  • Review the concept of convergence in probability as \(h\) approaches 0
USEFUL FOR

Students and researchers in statistics, particularly those focusing on non-parametric methods, as well as educators seeking to clarify concepts related to estimators and their properties.

rayge

Homework Statement


For the non-parametric estimator ##\hat{f}(x)=\frac{1}{2hn}\sum\limits_{i=1}^n I_i(x)## of a pdf,
(a) Obtain its mean and determine the bias of the estimator
(b) Obtain its variance

Homework Equations



The Attempt at a Solution


For (a), I think it goes like this:
##E[\hat{f}(x)] = E\left[\frac{1}{2hn}\sum\limits_{i=1}^n I_i(x)\right]=\frac{1}{2hn}\,nE[I_1(x)]=\frac{1}{2h}P(x-h<X<x+h)=\frac{1}{2h}\int_{x-h}^{x+h}f(t)\,dt=f(\xi)## for some ##\xi \in (x-h, x+h)##, by the mean value theorem for integrals.

However, my professor stated that he wants the answer in terms of integrals of ##f##. He doesn't want the answer in terms of ##\xi##, since ##\xi## is only guaranteed to exist by the mean value theorem; we can't actually write it down. I'm confused about what he is asking for. (I should probably ask him for clarification rather than strangers, but maybe someone sees something in this that I'm missing.) Maybe it has something to do with taking the limit as ##h## goes to 0; but in that case I think we get ##f(x)##, which is somewhat trivial and not really helpful.

As for obtaining the bias, I think I need to find ##E[\hat{f}(x)] - f(x)##, which (taking ##h \to 0##) I think becomes ##f(x) - f(x) = 0##, which is also trivial.

I think once I understand the key to this I'll understand how to get part (b), but I'm kind of lost. Thanks for any pointers.
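A quick Monte Carlo sanity check of the mean computation above may help. This is only an illustrative sketch, not part of the thread: it assumes the ##X_i## are i.i.d. standard normal, and the names `f_hat`, `Phi`, and all parameter values are arbitrary choices for the demonstration.

```python
import math
import random

random.seed(0)

def f_hat(x, sample, h):
    # (1/(2hn)) * sum_i I_i(x), where I_i(x) = 1 iff x - h < X_i < x + h
    hits = sum(1 for xi in sample if x - h < xi < x + h)
    return hits / (2 * h * len(sample))

def Phi(t):
    # standard normal CDF, via the error function
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

x, h, n, reps = 0.5, 0.2, 200, 2000

# Theoretical mean from the derivation: (1/(2h)) * P(x - h < X < x + h)
theory = (Phi(x + h) - Phi(x - h)) / (2 * h)

# Monte Carlo estimate of E[f_hat(x)], averaging over many samples of size n
avg = sum(f_hat(x, [random.gauss(0.0, 1.0) for _ in range(n)], h)
          for _ in range(reps)) / reps

f_true = math.exp(-x * x / 2) / math.sqrt(2 * math.pi)  # true density at x
print(f"theory={theory:.3f}  simulated={avg:.3f}  f(x)={f_true:.3f}")
```

The simulated average matches ##\frac{1}{2h}\int_{x-h}^{x+h}f(t)\,dt## rather than ##f(x)## exactly; the gap between the two is the finite-##h## bias the problem is asking about.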
 
hmm. I don't understand your working. Each ##I_i## is going to be centred at a different place, corresponding to each of your data points, right? So each ##I_i## is a different function, so the expectation is not going to be the same for all of them.
 
