Proof of least squares estimators


Discussion Overview

The discussion revolves around deriving the least squares estimators in ordinary least squares (OLS) regression. Participants explore the mathematical foundations of minimizing the sum of squared errors and the conditions under which OLS achieves this minimization.

Discussion Character

  • Technical explanation
  • Mathematical reasoning
  • Homework-related

Main Points Raised

  • One participant expresses difficulty in proving that minimizing a specific function leads to the least squares estimators.
  • Another participant suggests treating the sum of squared errors as a function of the parameters and solving the resulting system of equations to find the estimates of slope and intercept.
  • There is mention of the need to verify the second derivative condition for a minimum, although the details of this verification are not fully explored.
  • A request is made for further clarification on how to prove that OLS minimizes the sum of squares function, indicating a desire for more detailed guidance.

Areas of Agreement / Disagreement

Participants appear to be exploring the same problem, but there is no explicit consensus on the methods or steps to be taken to prove the claims. The discussion remains unresolved regarding the specifics of the proof.

Contextual Notes

Limitations include the lack of detailed steps in the differentiation process and the conditions under which the second derivative test is applied. The discussion does not resolve the assumptions or definitions necessary for the proofs being sought.

julion
Hey guys, long time lurker, first time poster!
Just having some trouble with something. I'm probably just looking at it the wrong way, but I was wondering if anyone could help me with this.

I'm trying to prove that by choosing b_0 and b_1 to minimize

S(b_0, b_1) = \sum_{i=1}^n \left(y_i - (b_0 + b_1 x_i)\right)^2

you obtain the least squares estimators, namely:

b_1 = \frac{\sum_{i=1}^n (x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^n (x_i - \bar{x})^2}, \qquad b_0 = \bar{y} - b_1 \bar{x}

Also, I'm just wondering how you can prove that OLS minimizes the sum of squares function.
I know it has something to do with second derivatives, but I am a bit stuck.
Thanks!
 
It's a standard minimization problem. Set up the sum of squared errors (SSE), differentiate with respect to the betas, set the derivatives to zero, and solve for the betas. For a minimum, verify that the second derivative (here, the Hessian, since there are two parameters) at the beta values you found in the first step is positive definite.
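A minimal sketch of that second-order check (not spelled out in the thread), assuming the simple-regression sum of squares S(b_0, b_1) = \sum_{i=1}^n (y_i - b_0 - b_1 x_i)^2:

\frac{\partial^2 S}{\partial b_0^2} = 2n, \qquad
\frac{\partial^2 S}{\partial b_1^2} = 2\sum_{i=1}^n x_i^2, \qquad
\frac{\partial^2 S}{\partial b_0\,\partial b_1} = 2\sum_{i=1}^n x_i,

so the determinant of the Hessian is

\det H = (2n)\left(2\sum_{i=1}^n x_i^2\right) - \left(2\sum_{i=1}^n x_i\right)^2 = 4n\sum_{i=1}^n (x_i - \bar{x})^2 \ge 0.

Since \partial^2 S/\partial b_0^2 = 2n > 0 and \det H > 0 whenever the x_i are not all equal, the Hessian is positive definite and any stationary point of S is a minimum.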
 
julion said:
I'm trying to prove that by choosing b_0 and b_1 to minimize the sum of squared errors you obtain the least squares estimators ... also just wondering how you can prove that OLS minimizes the sum of squares function.

Could you expand on how to do that with a little bit more help, please?
 
Treat

S(b_0, b_1) = \sum_{i=1}^n \left(y_i - (b_0 + b_1 x_i)\right)^2

as a function of b_0 and b_1, and solve this system of equations; the solutions give the formulas for the estimates of slope and intercept.

\begin{align*}
\frac{\partial S}{\partial b_0} &= 0\\
\frac{\partial S}{\partial b_1} &= 0
\end{align*}
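Carrying that through, here is a sketch of the algebra the thread leaves out. Expanding the two partial derivatives gives the normal equations:

\begin{align*}
\frac{\partial S}{\partial b_0} = -2\sum_{i=1}^n (y_i - b_0 - b_1 x_i) = 0
&\quad\Longrightarrow\quad \sum_{i=1}^n y_i = n b_0 + b_1 \sum_{i=1}^n x_i,\\
\frac{\partial S}{\partial b_1} = -2\sum_{i=1}^n x_i (y_i - b_0 - b_1 x_i) = 0
&\quad\Longrightarrow\quad \sum_{i=1}^n x_i y_i = b_0 \sum_{i=1}^n x_i + b_1 \sum_{i=1}^n x_i^2.
\end{align*}

The first equation gives b_0 = \bar{y} - b_1 \bar{x}; substituting that into the second and simplifying yields

b_1 = \frac{\sum_{i=1}^n (x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^n (x_i - \bar{x})^2}, \qquad b_0 = \bar{y} - b_1 \bar{x},

which are the least squares estimators asked about in the original post.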
 
thanks :)
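For anyone who wants to sanity-check the closed-form estimators numerically, here is a small Python/NumPy sketch (not part of the original thread; the data are made up purely for illustration) comparing the formulas above against numpy.polyfit:

import numpy as np

# Made-up example data (illustrative only)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

x_bar, y_bar = x.mean(), y.mean()

# Closed-form least squares estimators from the derivation above
b1 = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)
b0 = y_bar - b1 * x_bar

# Cross-check against NumPy's degree-1 polynomial fit (slope first, then intercept)
b1_np, b0_np = np.polyfit(x, y, 1)

print(b0, b1)        # intercept and slope from the formulas
print(b0_np, b1_np)  # should agree to floating-point precision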
 
