
#1
Nov 5, 2011, 10:46 PM

P: 7

I don't really understand what an unbiased estimator is in my statistics studies ~~ THANKS ~~




#2
Nov 5, 2011, 11:07 PM

P: 771

This should be in the statistics forum, but since I can answer it...
Remember that estimators are random variables; an estimator is "unbiased" if its expected value is equal to the true value of the parameter being estimated. To use regression as an example... Suppose you measured two variables x and y where the true (linear) relationship is given by...[tex]y = 5x + 2[/tex]Of course, any sample that you draw will have noise in it, so you have to estimate the true values of the slope and intercept. Suppose that you draw a thousand samples of x,y and calculate the least squares estimators for each sample (assuming that the noise is normally distributed). As you do that, you'll notice two things... 1) All of your estimates are different (because the data is noisy) 2) The mean of all of those estimates starts to converge on the true values (5 and 2) The second occurs because the estimator is unbiased. 
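The thousand-sample experiment described above can be sketched in Python with numpy; the sample size and noise level here are illustrative assumptions, not part of the original example:

```python
import numpy as np

rng = np.random.default_rng(0)
true_slope, true_intercept = 5.0, 2.0

slopes, intercepts = [], []
for _ in range(1000):
    # Each "sample" is 50 noisy (x, y) observations of the true line.
    x = rng.uniform(0.0, 10.0, size=50)
    y = true_slope * x + true_intercept + rng.normal(0.0, 1.0, size=50)
    slope, intercept = np.polyfit(x, y, 1)  # least squares fit
    slopes.append(slope)
    intercepts.append(intercept)

# Individual estimates differ, but their mean sits near the true values.
print(np.mean(slopes), np.mean(intercepts))  # close to 5 and 2
```

Each individual fit misses the true line a little, but averaging the thousand fits recovers (5, 2) almost exactly, which is the unbiasedness at work.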



#4
Nov 6, 2011, 12:35 AM

P: 7

What is an unbiased estimator??
"any sample that you draw will have noise in it" -- what does the noise mean?
Do you mean something like a one-to-one function??



#5
Nov 6, 2011, 12:38 AM

P: 4,570

Let's say you have a parameter; for simplicity, say it's the mean. Using your sample, you compute a statistic to estimate that parameter: this is why we call these things estimators, because they are estimating something. Since it depends on the random sample, an estimator has its own (sampling) distribution.

Unbiased estimators have the property that the expectation of the sampling distribution algebraically equals the parameter: in other words, the expectation of our estimator random variable gives us the parameter. If it doesn't, the estimator is called biased.

Of course, we want estimators that are unbiased, because statistically they will give us estimates that are centered on what they should be. The key thing is that this holds no matter how large the sample is. You will also learn that an estimator should be consistent, which for an unbiased estimator basically means that its variance goes to zero as the sample size goes to infinity.
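Both properties can be checked by simulation for the sample mean; a minimal sketch, where the true mean, the normal distribution, and the sample sizes are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
true_mean = 3.0

# Unbiasedness: averaging many sample means recovers the true mean,
# even for a small sample size (n = 10).
means_n10 = [rng.normal(true_mean, 2.0, size=10).mean() for _ in range(5000)]
print(np.mean(means_n10))  # close to 3.0

# Consistency: with a larger sample size (n = 1000) the sample mean
# scatters much less around the true mean.
means_n1000 = [rng.normal(true_mean, 2.0, size=1000).mean() for _ in range(5000)]
print(np.var(means_n10), np.var(means_n1000))  # the second is far smaller
```

The first average shows the expectation of the estimator matching the parameter; the shrinking variance in the second comparison is the consistency behaviour described above.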



#6
Nov 6, 2011, 12:53 AM

P: 7

Since an unbiased estimator's mean equals the parameter, for what purposes do we want it to be close to the parameter??



#7
Nov 6, 2011, 01:05 AM

P: 4,570

So our estimator is a random variable that is a function of other random variables, and its distribution is the sampling distribution of the estimate for the parameter we are estimating. When you start off, we look at estimating things like means and variances, but we can create estimators that are really complicated if we want to: it uses the same idea as the mean and the variance, but measures something else of interest.

With regard to unbiasedness: if our estimator were biased, then when we used statistical theory we might not get the right intervals for the actual parameter. Just think about having an estimator that was really biased (let's say centered five standard deviations away from the true parameter) and computing a 95% confidence interval from it: it wouldn't be useful to use that estimator, would it? You might as well not use an estimator at all if that is all you had.

So yes, in response to being close to the parameter, that is what we want. We want the mean of the estimator random variable to be the parameter of interest, no matter what the sample is and no matter how big the sample is.
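To make bias concrete, here is a small sketch using a standard textbook example (not taken from this thread): the sample variance with an n divisor is biased, while dividing by n - 1 makes it unbiased. The true variance and sample size are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
true_var = 4.0  # variance of a normal(0, 2) population

biased, unbiased = [], []
for _ in range(20000):
    sample = rng.normal(0.0, 2.0, size=5)
    biased.append(sample.var(ddof=0))    # divides by n     -> biased low
    unbiased.append(sample.var(ddof=1))  # divides by n - 1 -> unbiased

# The biased estimator averages (n-1)/n * 4 = 3.2; the unbiased one averages 4.
print(np.mean(biased), np.mean(unbiased))
```

Any interval built around the ddof=0 estimator would be systematically centered below the true variance, which is exactly the problem with using a biased estimator described above.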



