Proof of least squares estimators

In summary, the conversation works through proving that choosing b0 and b1 to minimize the sum of squared errors yields the least squares estimators, and touches on how to prove that the OLS (ordinary least squares) solution actually minimizes the sum of squares function. The solution involves setting up the sum of squared errors, differentiating it with respect to each parameter, setting the derivatives to zero, and solving.
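For a quick numerical sanity check of the result, here is a minimal Python sketch (the data below are made up purely for illustration) that computes the closed-form estimators and compares them against numpy's general-purpose polynomial fit:

[code]
import numpy as np

# Hypothetical sample data, made up for illustration only
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

xbar, ybar = x.mean(), y.mean()

# Closed-form least squares estimators (derived in the thread below)
b1 = np.sum((x - xbar) * (y - ybar)) / np.sum((x - xbar) ** 2)
b0 = ybar - b1 * xbar

# Cross-check against numpy's degree-1 polynomial fit,
# which returns coefficients highest power first: (slope, intercept)
b1_ref, b0_ref = np.polyfit(x, y, 1)

print(b0, b1)           # intercept and slope from the closed-form formulas
print(b0_ref, b1_ref)   # should match to floating-point precision
[/code]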
  • #1
Hey guys, long time lurker, first time poster!
Just having some trouble with something... I'm probably just looking at it the wrong way, but I was wondering if anyone could help me with this.

I'm trying to prove that by choosing b0 and b1 to minimize

[tex]
S(b_0, b_1) = \sum_{i=1}^n \left(y_i - (b_0 + b_1 x_i)\right)^2
[/tex]

you obtain the least squares estimators, namely:

[tex]
b_1 = \frac{\sum_{i=1}^n (x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^n (x_i - \bar{x})^2}, \qquad b_0 = \bar{y} - b_1 \bar{x}
[/tex]

Also, I'm just wondering how you can prove that the OLS solution actually minimizes the sum of squares function.
I know it has something to do with second derivatives, but I am a bit stuck.
Thanks!
 
  • #2
It's a standard minimization problem. Set up the sum of squared errors (SSE), differentiate with respect to each parameter, set the derivatives to zero, and solve. For a minimum, verify that the matrix of second derivatives (the Hessian) at the values you found in the first step is positive definite.
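A sketch of those steps for the simple linear model (standard algebra, filling in what is not shown above): differentiating the SSE and setting each partial derivative to zero gives

[tex]
\begin{align*}
\frac{\partial S}{\partial b_0} &= -2\sum_{i=1}^n \left(y_i - b_0 - b_1 x_i\right) = 0
\quad\Rightarrow\quad b_0 = \bar{y} - b_1 \bar{x},\\
\frac{\partial S}{\partial b_1} &= -2\sum_{i=1}^n x_i\left(y_i - b_0 - b_1 x_i\right) = 0,
\end{align*}
[/tex]

and substituting the first solution into the second equation yields

[tex]
b_1 = \frac{\sum_{i=1}^n (x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^n (x_i - \bar{x})^2}.
[/tex]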
 
  • #3
julion said:
Also, I'm just wondering how you can prove that the OLS solution actually minimizes the sum of squares function. I know it has something to do with second derivatives, but I am a bit stuck.

Could you expand on how to do that with a little more detail, please?
 
  • #4
Treat

[tex]
S(b_0, b_1) = \sum_{i=1}^n \left(y_i - (b_0 + b_1 x_i)\right)^2
[/tex]

as a function of [tex] b_0 [/tex] and [tex] b_1 [/tex], and solve the following system of equations; the solutions give the formulas for the estimates of the slope and intercept.

[tex]
\begin{align*}
\frac{\partial S}{\partial b_0} & = 0\\
\frac{\partial S}{\partial b_1} & = 0
\end{align*}
[/tex]
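For the second part of the question (showing that this critical point is a minimum rather than a maximum or a saddle point), a sketch of the second-order check: the Hessian of [tex] S [/tex] is

[tex]
H = 2\begin{pmatrix} n & \sum_{i=1}^n x_i\\ \sum_{i=1}^n x_i & \sum_{i=1}^n x_i^2 \end{pmatrix},
[/tex]

with determinant [tex] 4\left(n\sum_i x_i^2 - \left(\sum_i x_i\right)^2\right) = 4n\sum_i (x_i - \bar{x})^2 > 0 [/tex] whenever the [tex] x_i [/tex] are not all equal, and positive (1,1) entry [tex] 2n [/tex]. So [tex] H [/tex] is positive definite, [tex] S [/tex] is a convex quadratic in [tex] b_0, b_1 [/tex], and the critical point is the unique global minimum.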
 
  • #5
Thanks :)
 
