
Least squares estimate problem

  • Thread starter squenshl
  • #1

Homework Statement


Suppose that ##Y \sim N_n\left(X\beta,\sigma^2I\right)##, where the density function of ##Y## is
$$\frac{1}{\left(2\pi\sigma^2\right)^{\frac{n}{2}}}e^{-\frac{1}{2\sigma^2}(Y-X\beta)^T(Y-X\beta)},$$
and ##X## is an ##n\times p## matrix of rank ##p##.
Let ##\hat{\beta}## be the least squares estimator of ##\beta##.

Show that ##(Y-X\beta)^T(Y-X\beta) = \left(Y-X\hat{\beta}\right)^T(Y-X\hat{\beta})+\left(\hat{\beta}-\beta\right)^TX^TX\left(\hat{\beta}-\beta\right)## and therefore that ##\hat{\beta}## is the least squares estimate.
Hint: ##Y-X\beta = Y-X\hat{\beta}+X\hat{\beta}-X\beta##.

Homework Equations




The Attempt at a Solution


I have no idea where to start. Do I substitute the hint into ##(Y-X\beta)^T(Y-X\beta)## and expand out the brackets???

Please help!!!
 

Answers and Replies

  • #2
andrewkirk
Science Advisor
Homework Helper
Insights Author
Gold Member
There seems to be something odd about how this problem is stated. It asks the student to assume that ##\hat\beta## is the least squares estimator of ##\beta##, and then to use that to prove that it is the least squares estimate. Are they trying to draw a distinction between estimator and estimate? If not, the problem is trivial. However, if we want to be very precise about terminology, I would have thought that an estimator is a function, whereas the estimate is the value that function returns. Is there some particular meaning of 'estimator' and 'estimate' that they are using in your course?

As to how to proceed to prove their formula: yes, substitution along the lines you mention sounds like a good way to start. You can rewrite the RHS of the hint as ##(Y-X\hat\beta)+X(\hat\beta-\beta)##. Expanding that out gives a right-hand side equal to what they show above, plus
$$2(X(\hat\beta-\beta))^T(Y-X\hat\beta)$$
So this term needs to be shown to be zero. However, it seems to me that should be impossible, since it is a function of the unknown parameter vector ##\beta##, which can be changed without changing any of the other elements in the formula (##X, Y, \hat\beta##).
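For reference, the expansion from the hint, written out in full, is
$$(Y-X\beta)^T(Y-X\beta) = \left(Y-X\hat\beta\right)^T\left(Y-X\hat\beta\right) + 2\left(\hat\beta-\beta\right)^TX^T\left(Y-X\hat\beta\right) + \left(\hat\beta-\beta\right)^TX^TX\left(\hat\beta-\beta\right),$$
using ##\left(X(\hat\beta-\beta)\right)^T = \left(\hat\beta-\beta\right)^TX^T##, so everything hinges on whether that middle cross term vanishes.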

Are you sure there wasn't an expectation operator around that equation they want you to prove, or some other constraining condition?
 
  • #3
Nope, that's the question as asked.
 
  • #4
Ray Vickson
Science Advisor
Homework Helper
Dearly Missed

Let ##Q(\beta) = (Y - X \beta)^T (Y - X \beta)##. If you write ##\beta = b + e##, you can expand ##Q(b+e)## as a quadratic in ##e##. It will have zero-order terms (not containing ##e##), first-order terms (linear in ##e##) and second-order terms (of the form ##e^T M e##; here ##M = X^TX##). However, if you choose ##b## correctly, the first-order terms in ##e## will vanish, leaving you with only the zero-order and second-order terms. That happens when ##b = \hat{\beta}##. You will then obtain the expression you are being asked to prove, with ##e = \beta - \hat{\beta}##.
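To spell that out: with ##\beta = b + e##,
$$Q(b+e) = (Y-Xb)^T(Y-Xb) - 2e^TX^T(Y-Xb) + e^TX^TXe,$$
and the first-order term vanishes for every ##e## exactly when ##X^T(Y-Xb) = 0##, i.e. when ##b## solves the normal equations ##X^TX b = X^TY##. Since ##X## has rank ##p##, ##X^TX## is invertible, so ##b = \hat{\beta} = (X^TX)^{-1}X^TY##. With ##e = \beta - \hat{\beta}## this is exactly the identity in the problem, and because the remaining quadratic term is non-negative, it gives ##Q(\beta) \ge Q(\hat{\beta})## for all ##\beta##, which is the least squares property.

If it helps, here is a minimal numerical sketch of that identity (assuming ##\hat{\beta}## is computed from the normal equations as above; the simulated ##X## and ##Y## are only for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 20, 3

# Simulated full-rank design matrix X and response Y (illustrative data only)
X = rng.normal(size=(n, p))
beta_true = rng.normal(size=p)
Y = X @ beta_true + rng.normal(size=n)

# Least squares estimate from the normal equations X^T X b = X^T Y
beta_hat = np.linalg.solve(X.T @ X, X.T @ Y)

def Q(b):
    """Sum of squares (Y - Xb)^T (Y - Xb)."""
    r = Y - X @ b
    return r @ r

# Compare both sides of the claimed identity at an arbitrary beta
beta = rng.normal(size=p)
lhs = Q(beta)
rhs = Q(beta_hat) + (beta_hat - beta) @ (X.T @ X) @ (beta_hat - beta)
print(lhs, rhs)                      # the two values agree to rounding error

# The cross term vanishes because X^T (Y - X beta_hat) = 0 by the normal equations
print(X.T @ (Y - X @ beta_hat))      # numerically ~ the zero vector
```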
 
