
#1
Feb 7 '12, 12:42 AM

P: 2

So in computing the variance-covariance matrix for β̂ in an OLS model, we arrive at

Var-Cov(β̂) = σ_ε² E{(X'X)^(-1)}

However, I don't see how X can be considered nonstochastic, so that we can simply drop the expectation sign and write

Var-Cov(β̂) = σ_ε² (X'X)^(-1)

I'm accepting this as true (since it's so written in the text), but I'm taking a leap of faith here: if it is true, then the elements of the Var-Cov matrix are expressed in terms of sample statistics and are therefore themselves stochastic. I thought that the variance of a consistent estimator of a parameter should be a deterministic quantity that does not depend on the sample observations (other than through the sample size, n) — as with the variances we compare against the Cramér-Rao lower bound when checking efficiency. Most likely I'm misunderstanding something here; any pointers would be greatly appreciated!
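For context, here is the standard derivation of that formula when X is treated as fixed (the assumption the text is making); every step below uses only the model y = Xβ + ε with E[ε] = 0 and E[εε'] = σ_ε²I:

```latex
\hat\beta = (X'X)^{-1}X'y = \beta + (X'X)^{-1}X'\varepsilon,
\qquad
\operatorname{Var}(\hat\beta \mid X)
  = (X'X)^{-1}X'\,\mathbb{E}[\varepsilon\varepsilon']\,X(X'X)^{-1}
  = \sigma_\varepsilon^{2}\,(X'X)^{-1}.
```

Note that the expectation here is taken only over ε; once X is conditioned on (or assumed nonstochastic), (X'X)^(-1) is a constant matrix and the outer E{·} is unnecessary.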



#2
Feb 7 '12, 03:31 AM

Sci Advisor
P: 3,175

You haven't clearly stated a mathematical question. Is X supposed to be the vector of independent variables? If so, they aren't considered to be stochastic if you compute the regression so it minimizes the least square error in predicting the dependent variables, which are "Y", by tradition. If you have data of the form (X,Y) and there are "errors" in both X and Y, you should use a "total least squares" model.




#3
Feb 7 '12, 03:37 AM

P: 4,570

Hey chevrox, and welcome to the forums.
Like Stephen Tashi, I am going to wait for clarification of what your variables are, but I did want to comment on one thing you said:



#4
Feb 8 '12, 12:45 AM

P: 2

Having trouble understanding variance of OLS estimator
Thanks for the replies! Yes, X is the n×k matrix of explanatory variables such that y = Xβ + ε. I think I understand it now. The variables in X need not follow a stochastic process, and even if they do, all of the variability in y is attributed to ε in the model: the independent variables affect the dependent variable only through their observed values, not through the distributions those values were drawn from, so X is treated as nonstochastic. Meanwhile, β̂ does not lose its consistency, since E(β̂) = β (which follows directly when X is nonstochastic) and Var(β̂) → 0 as n → ∞, even though Var(β̂) varies from sample to sample.
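A quick numerical sketch of the "X held fixed" interpretation (names and simulated numbers are my own, just for illustration): keep one X matrix, redraw only the errors ε many times, and compare the Monte Carlo variance-covariance of β̂ against the formula σ_ε²(X'X)⁻¹.

```python
# Sketch: with X held fixed across replications, the sampling
# variance-covariance of the OLS estimator matches sigma^2 (X'X)^{-1}.
import numpy as np

rng = np.random.default_rng(0)
n, k, sigma = 200, 3, 2.0
beta = np.array([1.0, 2.0, -0.5])

# One fixed design matrix: intercept plus two uniform regressors
X = np.column_stack([np.ones(n), rng.uniform(-1, 1, size=(n, k - 1))])

# Theoretical Var-Cov with X fixed: sigma^2 (X'X)^{-1}
varcov_theory = sigma**2 * np.linalg.inv(X.T @ X)

# Monte Carlo: redraw only the errors, re-estimate beta each time
reps = 5000
betas = np.empty((reps, k))
for r in range(reps):
    y = X @ beta + sigma * rng.standard_normal(n)
    betas[r] = np.linalg.lstsq(X, y, rcond=None)[0]

varcov_mc = np.cov(betas, rowvar=False)
print(np.max(np.abs(varcov_mc - varcov_theory)))  # should be small
```

If instead you redrew X on every replication, the unconditional variance would involve E{(X'X)⁻¹}, which is exactly the expectation the fixed-X assumption lets you drop.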


