# Proof of Least Squares Estimators

Hey guys, long time lurker, first time poster!
I'm having some trouble with something. I'm probably just looking at it the wrong way, but I was wondering if anyone could help me with this.

I'm trying to prove that choosing $$b_0$$ and $$b_1$$ to minimize the sum of squared errors

$$S(b_0, b_1) = \sum_{i=1}^n \left(y_i - (b_0 + b_1 x_i)\right)^2$$

yields the least squares estimators, namely:
http://img15.imageshack.us/img15/3641/partbx.jpg [Broken]

i.e. $$b_1 = \frac{\sum_{i=1}^n (x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^n (x_i - \bar{x})^2}, \qquad b_0 = \bar{y} - b_1 \bar{x}.$$

I'm also wondering how you can prove that the OLS solution actually minimizes the sum of squares function.
I know it has something to do with second derivatives, but I'm a bit stuck.
Thanks!


It's a standard minimization problem. Set up the sum of squared errors (SSE), differentiate with respect to each coefficient, set the derivatives to zero, and solve. To confirm you have a minimum, check that the matrix of second derivatives (the Hessian) at the solution you found is positive definite; in the one-variable case this reduces to checking that the second derivative is positive.
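That recipe can also be sanity-checked numerically. Here's a quick sketch in Python/NumPy (the data set is made up purely for illustration): it computes the closed-form estimators, verifies that perturbing either coefficient never decreases the SSE, and compares against NumPy's own fit.

```python
import numpy as np

# Small made-up dataset (purely illustrative)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# Closed-form least squares estimators
xbar, ybar = x.mean(), y.mean()
b1 = np.sum((x - xbar) * (y - ybar)) / np.sum((x - xbar) ** 2)
b0 = ybar - b1 * xbar

def sse(b0_, b1_):
    """Sum of squared errors S(b0, b1)."""
    return np.sum((y - (b0_ + b1_ * x)) ** 2)

# Perturbing either coefficient away from the solution
# should never decrease the SSE
base = sse(b0, b1)
for eps in (-0.1, 0.1):
    assert sse(b0 + eps, b1) >= base
    assert sse(b0, b1 + eps) >= base

# Agrees with NumPy's own degree-1 least squares fit
fit = np.polyfit(x, y, 1)  # returns [slope, intercept]
assert np.allclose([b1, b0], fit)

print(b0, b1)
```

Of course this only checks one data set; the derivation below the fold is what proves it in general.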

Could you expand on how to do that with a little more detail, please?

Treat

$$S(b_0, b_1) = \sum_{i=1}^n \left(y_i - (b_0 + b_1 x_i)\right)^2$$

as a function of $$b_0$$ and $$b_1$$, and solve the system of equations

\begin{align*} \frac{\partial S}{\partial b_0} & = 0\\ \frac{\partial S}{\partial b_1} & = 0 \end{align*}

simultaneously. The solutions give the formulas for the estimates of the slope and intercept.
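Writing out those partial derivatives (a standard derivation, sketched here for completeness) gives the normal equations:

\begin{align*} \frac{\partial S}{\partial b_0} &= -2\sum_{i=1}^n \left(y_i - (b_0 + b_1 x_i)\right) = 0\\ \frac{\partial S}{\partial b_1} &= -2\sum_{i=1}^n x_i\left(y_i - (b_0 + b_1 x_i)\right) = 0 \end{align*}

Solving these two equations simultaneously yields

$$b_1 = \frac{\sum_{i=1}^n (x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^n (x_i - \bar{x})^2}, \qquad b_0 = \bar{y} - b_1 \bar{x}.$$

And to answer the second-derivative question: the second partials are $$\frac{\partial^2 S}{\partial b_0^2} = 2n$$, $$\frac{\partial^2 S}{\partial b_1^2} = 2\sum x_i^2$$, and $$\frac{\partial^2 S}{\partial b_0 \partial b_1} = 2\sum x_i$$, so the Hessian determinant is

$$4\left(n\sum x_i^2 - \Big(\sum x_i\Big)^2\right) = 4n\sum_{i=1}^n (x_i - \bar{x})^2 > 0,$$

which together with $$2n > 0$$ shows the Hessian is positive definite, so the critical point really is a minimum.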

thanks :)