Proof of Least Squares Estimators

In summary, this thread discusses how to prove that choosing b0 and b1 to minimize the sum of squared errors yields the least squares estimators, and how to show that the OLS (ordinary least squares) solution actually minimizes the sum of squares function. The approach is to set up the sum of squared errors, differentiate it with respect to each coefficient, set the derivatives to zero, and solve for the coefficients.
  • #1
julion
Hey guys, long time lurker, first time poster!
Just having some trouble with something. I'm probably just looking at it the wrong way, but I was wondering if anyone could help me with this.

I'm trying to prove that by choosing b0 and b1 to minimize
http://img24.imageshack.us/img24/7/partas.jpg
you obtain the least squares estimators, namely:
http://img15.imageshack.us/img15/3641/partbx.jpg

Also, I'm just wondering how you can prove that OLS minimizes the sum of squares function.
I know it has something to do with second derivatives, but I am a bit stuck.
Thanks!
 
  • #2
It's a standard minimization problem. Set up the sum of squared errors (SSE), differentiate with respect to each coefficient, set the derivatives to zero, and solve. To confirm a minimum, check the second-order condition at the solution: with two parameters, the Hessian of the SSE must be positive definite there.
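Concretely, with two parameters the "second derivative" is the Hessian of the SSE. Writing [tex] S(b_0, b_1) = \sum_{i=1}^n \left(y_i - b_0 - b_1 x_i\right)^2 [/tex], a sketch of the check looks like this:

[tex]
\begin{align*}
H &= \begin{pmatrix} \dfrac{\partial^2 S}{\partial b_0^2} & \dfrac{\partial^2 S}{\partial b_0 \partial b_1}\\[6pt] \dfrac{\partial^2 S}{\partial b_0 \partial b_1} & \dfrac{\partial^2 S}{\partial b_1^2} \end{pmatrix} = 2\begin{pmatrix} n & \sum_i x_i\\ \sum_i x_i & \sum_i x_i^2 \end{pmatrix},\\
\det H &= 4\left(n\sum_i x_i^2 - \Big(\sum_i x_i\Big)^2\right) = 4n\sum_i (x_i - \bar{x})^2.
\end{align*}
[/tex]

Since [tex] \partial^2 S / \partial b_0^2 = 2n > 0 [/tex] and [tex] \det H > 0 [/tex] whenever the [tex] x_i [/tex] are not all equal, the Hessian is positive definite, so the stationary point is a minimum; in fact [tex] S [/tex] is convex in [tex] (b_0, b_1) [/tex], so it is the global minimum.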
 
  • #3
julion said:
I'm trying to prove that by choosing b0 and b1 to minimize the sum of squared errors you obtain the least squares estimators, and also wondering how you can prove that OLS minimizes the sum of squares function.

Could you expand on how to do that with a bit more detail please?
 
  • #4
Treat

[tex]
S(b_0, b_1) = \sum_{i=1}^n \left(y_i - (b_0 + b_1 x_i)\right)^2
[/tex]

as a function of [tex] b_0 [/tex] and [tex] b_1 [/tex], and solve the system of equations below; the solutions give the formulas for the estimates of the slope and intercept.

[tex]
\begin{align*}
\frac{\partial S}{\partial b_0} & = 0\\
\frac{\partial S}{\partial b_1} & = 0
\end{align*}
[/tex]
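Carrying the differentiation through (a standard derivation, sketched here for reference) gives the normal equations and the usual closed-form estimators:

[tex]
\begin{align*}
\frac{\partial S}{\partial b_0} &= -2\sum_{i=1}^n \left(y_i - b_0 - b_1 x_i\right) = 0,\\
\frac{\partial S}{\partial b_1} &= -2\sum_{i=1}^n x_i\left(y_i - b_0 - b_1 x_i\right) = 0,
\end{align*}
[/tex]

which solve to

[tex]
b_1 = \frac{\sum_{i=1}^n (x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^n (x_i - \bar{x})^2}, \qquad b_0 = \bar{y} - b_1 \bar{x}.
[/tex]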
 
  • #5
thanks :)
 

What is the purpose of a "Proof of Least Squares Estimators"?

The purpose of a "Proof of Least Squares Estimators" is to provide mathematical evidence that the least squares method is an optimal way to estimate the parameters of a linear regression model. It shows that the estimated coefficients obtained through the least squares method minimize the sum of squared residuals, making the fitted line the best fit to the data in the least squares sense.

How is the proof of least squares estimators derived?

The proof of the least squares estimators is derived using calculus, specifically partial derivatives. Setting the partial derivatives of the sum of squared residuals with respect to each coefficient to zero gives the normal equations, whose solution is the estimated coefficients; because the sum of squared residuals is a convex function of the coefficients, this stationary point is in fact the minimum.
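As an illustration (a minimal sketch in Python with NumPy; the data values below are made up purely to exercise the formulas), the closed-form estimators can be cross-checked against a generic least squares solver:

[code]
import numpy as np

# Hypothetical data, used only to illustrate the formulas
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Closed-form estimators from the partial-derivative (normal equation) conditions
x_bar, y_bar = x.mean(), y.mean()
b1 = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)
b0 = y_bar - b1 * x_bar

# Cross-check with a generic solver for min ||A b - y||^2
A = np.column_stack([np.ones_like(x), x])
(b0_ls, b1_ls), *_ = np.linalg.lstsq(A, y, rcond=None)

print(b0, b1)        # closed-form intercept and slope
print(b0_ls, b1_ls)  # should match up to floating-point error
[/code]

Both computations should return the same intercept and slope, since np.linalg.lstsq minimizes the same sum of squared residuals.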

What assumptions are made in the proof of least squares estimators?

The minimization argument itself makes no distributional assumptions. The usual statistical justification of the least squares estimators additionally assumes that the error terms in the linear regression model have a mean of zero and constant variance, and are independent of each other and of the independent variables; normality of the errors is often added for exact inference.

Are there any limitations to the proof of least squares estimators?

The proof of least squares estimators assumes that the independent variables are fixed and not subject to any measurement errors. This can be a limitation in real-world applications where the independent variables may have some degree of uncertainty. Additionally, the proof assumes that the model is linear and may not hold for non-linear models.

How is the proof of least squares estimators used in practice?

The proof of least squares estimators is used to support the use of the least squares method in estimating the parameters of a linear regression model. It provides a solid mathematical foundation for the method and helps to justify its use in statistical analysis and data modeling. Additionally, it can be used to evaluate the validity of the assumptions made in the proof and make any necessary adjustments to the model if those assumptions are not met.
