Linear Regression and OLS

  • Thread starter fog37
  • #1
fog37
TL;DR Summary
Understanding whether linear regression can be done with other variants of least squares
Hello,

Simple linear regression aims to find the slope and intercept of the best-fit line for a pair of ##X## and ##Y## variables.
In general, the optimal intercept and slope are found using OLS. However, I learned that the "O" means "ordinary," and that there are other types of least-squares computations...

Question: is it possible to apply those variants of LS to the linear regression model, i.e., can we find the best-fit line parameters using something other than OLS (for example, I think there is "robust" LS, etc.)?

Thank you!
 

Answers and Replies

  • #2
scottdave
Science Advisor
Homework Helper
Insights Author
Gold Member
I have taken a course in regression and some courses that use regression techniques. From what I remember, the "Ordinary" in OLS refers to some assumptions we make, rather than to the method itself.
One assumption is that the residuals are randomly distributed.

I like this textbook (free download) https://www.statlearning.com/
 
  • #3
FactChecker
Science Advisor
Homework Helper
Gold Member
Simple linear regression aims to find the slope and intercept of the best-fit line for a pair of ##X## and ##Y## variables.
More precisely, it finds the line that uses the ##X## value to estimate the ##Y## values with the minimum sum of squared errors for the ##Y## estimates. The phrase "best-fit line" can mean something different, referring to minimizing the sum of squared perpendicular distances from the data points to the line (total least squares).
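As a concrete sketch of this (not from the thread; the data are made up for illustration), the slope and intercept that minimize the sum of squared vertical errors have a well-known closed form:

```python
# Minimal OLS sketch: closed-form slope and intercept minimizing the sum of
# squared vertical (Y) errors. Pure Python; the data below are demo values.
def ols_fit(x, y):
    n = len(x)
    x_bar = sum(x) / n
    y_bar = sum(y) / n
    # slope = (covariance of x and y) / (variance of x), up to a common factor n
    sxy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
    sxx = sum((xi - x_bar) ** 2 for xi in x)
    slope = sxy / sxx
    intercept = y_bar - slope * x_bar
    return slope, intercept

x = [1, 2, 3, 4, 5]
y = [2.1, 3.9, 6.2, 7.8, 10.1]   # roughly y = 2x, with noise
slope, intercept = ols_fit(x, y)
```

For these points the fit comes out near slope 2 and intercept 0, as expected for data scattered around ##y = 2x##.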
In general, the optimal intercept and slope are found using OLS. However, I learned that "O" means ordinary and there are other types of least square computations...

Question: is it possible to apply those variants of LS to the linear regression model, i.e., can we find the best-fit line parameters using something other than OLS (for example, I think there is "robust" LS, etc.)?
This is an interesting question. I am not an expert in this, but I see ( https://en.wikipedia.org/wiki/Robust_regression ) that there are attempts to decrease the influence of outliers. Some methods have been implemented in R (see https://stat.ethz.ch/R-manual/R-patched/library/MASS/html/rlm.html ). I don't know if that implementation is publicly available. It is applied in an example in https://stats.oarc.ucla.edu/r/dae/robust-regression/
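To illustrate the idea behind such robust methods (this is a from-scratch sketch of iteratively reweighted least squares with Huber weights, not the actual MASS::rlm implementation; data and tuning constant are illustrative):

```python
# Sketch of robust regression via iteratively reweighted least squares (IRLS)
# with Huber weights: points with large residuals get down-weighted, so a
# gross outlier barely moves the fit. Demo data; not the rlm algorithm itself.
def weighted_ols(x, y, w):
    sw = sum(w)
    x_bar = sum(wi * xi for wi, xi in zip(w, x)) / sw
    y_bar = sum(wi * yi for wi, yi in zip(w, y)) / sw
    sxy = sum(wi * (xi - x_bar) * (yi - y_bar) for wi, xi, yi in zip(w, x, y))
    sxx = sum(wi * (xi - x_bar) ** 2 for wi, xi in zip(w, x))
    slope = sxy / sxx
    return slope, y_bar - slope * x_bar

def huber_irls(x, y, k=1.345, iters=50):
    w = [1.0] * len(x)
    for _ in range(iters):
        slope, intercept = weighted_ols(x, y, w)
        resid = [yi - (slope * xi + intercept) for xi, yi in zip(x, y)]
        # robust scale estimate from the median absolute residual
        mad = sorted(abs(r) for r in resid)[len(resid) // 2] or 1.0
        s = mad / 0.6745
        # Huber weight: full weight inside k scale units, shrinking outside
        w = [1.0 if abs(r / s) <= k else k / abs(r / s) for r in resid]
    return slope, intercept

x = [1, 2, 3, 4, 5, 6]
y = [1.0, 2.0, 30.0, 4.0, 5.0, 6.0]   # y = x except one gross outlier at x = 3
slope, intercept = huber_irls(x, y)
```

On this data a plain OLS fit is badly distorted by the outlier, while the reweighted fit ends up close to the underlying line ##y = x##.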
 
  • #4
Office_Shredder
Staff Emeritus
Science Advisor
Gold Member
Even least squares is not necessary. You can find a slope and intercept that minimize any penalty function you want.
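As a sketch of this point (my own illustration, with made-up data and an arbitrary grid resolution), here is a line fit by minimizing a different penalty, the sum of absolute errors, via a crude grid search:

```python
# Sketch: fitting a line by minimizing an arbitrary penalty function, here the
# sum of absolute errors (least absolute deviations), using brute-force grid
# search over candidate slopes and intercepts. Demo data for illustration only.
def lad_grid_fit(x, y, slopes, intercepts):
    def penalty(a, b):
        return sum(abs(yi - (a * xi + b)) for xi, yi in zip(x, y))
    # pick the (slope, intercept) pair on the grid with the smallest penalty
    return min(((a, b) for a in slopes for b in intercepts),
               key=lambda ab: penalty(*ab))

x = [1, 2, 3, 4, 5]
y = [1.0, 2.0, 3.0, 4.0, 20.0]           # y = x except an outlier at x = 5
grid = [i / 10 for i in range(-50, 51)]  # -5.0 to 5.0 in steps of 0.1
slope, intercept = lad_grid_fit(x, y, grid, grid)
```

The absolute-error penalty ignores the outlier and recovers ##y = x## here, while the squared-error penalty would be pulled toward it; swapping in any other penalty function only requires changing `penalty`, though (as noted below) most choices lose the closed-form solution OLS enjoys.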
 
  • Like
Likes fog37, scottdave and FactChecker
  • #5
FactChecker
Science Advisor
Homework Helper
Gold Member
Even least squares is not necessary. You can find a slope and intercept that minimize any penalty function you want.
Good point, although most penalty functions would require iterative, non-analytical minimization algorithms that are less intuitive. Also, I do not know what the risk of introducing local minima would be.
 
  • #6
Stephen Tashi
Science Advisor
In general, the optimal intercept and slope are found using OLS.

When you read about this, what was the definition of "optimal"?

(A mathematical definition can be given in the context of statistical estimation and the properties of estimators.)
 
  • Like
Likes FactChecker
