
Proof of least squares estimators

  1. May 13, 2009 #1
    Hey guys, long time lurker, first time poster!
    Just having some trouble with something. I'm probably just looking at it the wrong way, but I was wondering if anyone could help me with this.

    I'm trying to prove that by choosing b0 and b1 to minimize the sum of squared residuals
    [tex] S(b_0, b_1) = \sum_{i=1}^n \left(y_i - (b_0 + b_1 x_i)\right)^2 [/tex]
    you obtain the least squares estimators, namely:
    [tex] \hat{b}_1 = \frac{\sum_{i=1}^n (x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^n (x_i - \bar{x})^2}, \qquad \hat{b}_0 = \bar{y} - \hat{b}_1 \bar{x} [/tex]

    Also just wondering how you can prove that OLS actually minimizes the sum of squares function.
    I know it has something to do with second derivatives, but I'm a bit stuck.
  3. May 19, 2009 #2
    It's a standard minimization problem. Set up the sum of squared errors (SSE), differentiate with respect to each coefficient, set the derivatives to zero, and solve. For a minimum, verify that the matrix of second derivatives (the Hessian) at the solution you found in the first step is positive definite.
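    To sketch the second-order check for this particular problem: with S the SSE as a function of b_0 and b_1, the Hessian works out to

    [tex]
    H = \begin{pmatrix}
    \frac{\partial^2 S}{\partial b_0^2} & \frac{\partial^2 S}{\partial b_0 \, \partial b_1}\\[4pt]
    \frac{\partial^2 S}{\partial b_1 \, \partial b_0} & \frac{\partial^2 S}{\partial b_1^2}
    \end{pmatrix}
    = 2\begin{pmatrix}
    n & \sum_{i=1}^n x_i\\
    \sum_{i=1}^n x_i & \sum_{i=1}^n x_i^2
    \end{pmatrix}
    [/tex]

    Its trace is positive, and its determinant is [tex] 4\left(n\sum_i x_i^2 - \left(\sum_i x_i\right)^2\right) = 4n\sum_i (x_i - \bar{x})^2 > 0 [/tex] whenever the x_i are not all equal, so H is positive definite and the stationary point is indeed a minimum.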
  4. May 24, 2009 #3
    Could you expand on how to do that with a little more detail, please?
  5. May 24, 2009 #4


    Write

    [tex] S(b_0, b_1) = \sum_{i=1}^n \left(y_i - (b_0 + b_1 x_i)\right)^2 [/tex]

    as a function of [tex] b_0 [/tex] and [tex] b_1 [/tex], set both partial derivatives to zero,

    [tex]
    \begin{align*}
    \frac{\partial S}{\partial b_0} & = 0\\
    \frac{\partial S}{\partial b_1} & = 0
    \end{align*}
    [/tex]

    and solve this system of equations - the solutions will give the formulas for the estimates of slope and intercept.
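
    Carrying out the differentiation turns that system into the normal equations, which can then be solved for the two estimates:

    [tex]
    \begin{align*}
    \frac{\partial S}{\partial b_0} &= -2\sum_{i=1}^n \left(y_i - b_0 - b_1 x_i\right) = 0
    \;\Longrightarrow\; b_0 = \bar{y} - b_1 \bar{x},\\
    \frac{\partial S}{\partial b_1} &= -2\sum_{i=1}^n x_i\left(y_i - b_0 - b_1 x_i\right) = 0
    \;\Longrightarrow\; b_1 = \frac{\sum_{i=1}^n (x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^n (x_i - \bar{x})^2}.
    \end{align*}
    [/tex]

    Substituting the expression for [tex] b_0 [/tex] from the first equation into the second is what produces the centered sums in the slope formula.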
  6. May 24, 2009 #5
    thanks :)