It seems to me that Linear Regression and Linear Least Squares are often used interchangeably, but I believe there are subtle differences between the two. From what I can tell (for simplicity, let's assume the uncertainty is in y only), **Linear Least Squares** refers to the general case of fitting a straight line to a set of data, where the method of determining the optimal fit can be almost anything (e.g. sum of vertical differences, sum of absolute values of vertical differences, maximum vertical difference, sum of squares of vertical differences, etc.), whereas **Linear Regression** refers to one specific measure of optimal fit, namely, the sum of the squares of the vertical differences.
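To make the sum-of-squares criterion concrete, here is a quick pure-Python sketch of the closed-form straight-line fit (the data values are just made up for illustration):

```python
# Minimal sketch: fit a straight line y = m*x + k by minimizing the sum of
# squared vertical differences. The data below is made up for illustration.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]  # exactly y = 2x + 1

n = len(xs)
sx = sum(xs)
sy = sum(ys)
sxx = sum(x * x for x in xs)
sxy = sum(x * y for x, y in zip(xs, ys))

# Setting the partial derivatives of sum((y - m*x - k)^2) with respect to
# m and k to zero gives the usual closed-form solution:
m = (n * sxy - sx * sy) / (n * sxx - sx * sx)
k = (sy - m * sx) / n
```

With the other criteria in the list (absolute values, maximum difference) no such closed form exists, which is part of why the sum-of-squares measure is so common.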

Actually, it seems to me that Linear Least Squares doesn't necessarily mean that you are fitting a straight line to the data; it just means that the modelling function is linear in the unknowns (e.g. [itex]y = ax^2 + bx + c[/itex] is linear in a, b, and c). Perhaps it is established convention that **Linear Least Squares** does, in fact, refer to fitting a straight line, whereas **Least Squares** is the more general case?
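To illustrate "linear in the unknowns": the parabola above can be fit with exactly the same linear least-squares machinery, because the normal equations are a linear system in a, b, and c. A rough pure-Python sketch (the data values are made up):

```python
# Minimal sketch: least-squares fit of y = a*x^2 + b*x + c. The model is a
# parabola, but it is *linear in the unknowns* a, b, c, so the normal
# equations (A^T A) p = A^T y form a linear system. Made-up data:
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0, 0.0, 3.0, 10.0, 21.0]  # exactly y = 2x^2 - 3x + 1

# Design matrix: each row is [x^2, x, 1], one column per unknown.
A = [[x * x, x, 1.0] for x in xs]

# Normal equations: M = A^T A (3x3), v = A^T y (3-vector).
M = [[sum(A[i][r] * A[i][c] for i in range(len(xs))) for c in range(3)]
     for r in range(3)]
v = [sum(A[i][r] * ys[i] for i in range(len(xs))) for r in range(3)]

def solve3(M, v):
    """Gaussian elimination with partial pivoting for a small dense system."""
    M = [row[:] for row in M]
    v = v[:]
    n = len(v)
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        v[col], v[piv] = v[piv], v[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n):
                M[r][c] -= f * M[col][c]
            v[r] -= f * v[col]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (v[r] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

a, b, c = solve3(M, v)
```

The curve is not a straight line, yet nothing in the solution step goes beyond solving a linear system, which is what "linear" in Linear Least Squares seems to refer to.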

Lastly, **Non-linear Least Squares** refers to cases where the modelling function is not linear in the unknowns (e.g. [itex]y = e^{-ax^b}[/itex], where a and b are sought).
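For contrast, in the non-linear case there is no closed form and the normal equations must be solved iteratively. A rough Gauss-Newton sketch for the example above (the data and starting guess are made up):

```python
import math

# Minimal sketch of *non-linear* least squares: fit y = exp(-a * x^b), where
# the unknowns a, b enter non-linearly, so we linearize and iterate
# (plain Gauss-Newton here). Data generated from a = 0.5, b = 1.5:
xs = [0.5, 1.0, 1.5, 2.0, 2.5, 3.0]
ys = [math.exp(-0.5 * x ** 1.5) for x in xs]

a, b = 0.45, 1.4  # starting guess, reasonably close to the truth
for _ in range(50):
    saa = sab = sbb = ga = gb = 0.0
    for x, y in zip(xs, ys):
        f = math.exp(-a * x ** b)
        r = y - f                             # residual
        ja = -(x ** b) * f                    # df/da
        jb = -a * (x ** b) * math.log(x) * f  # df/db
        saa += ja * ja; sab += ja * jb; sbb += jb * jb
        ga += ja * r; gb += jb * r
    # Solve the 2x2 normal equations (J^T J) delta = J^T r for the step.
    det = saa * sbb - sab * sab
    da = (ga * sbb - gb * sab) / det
    db = (saa * gb - sab * ga) / det
    a += da
    b += db
```

Unlike the linear cases, this needs a starting guess and can fail to converge if that guess is poor, which seems to be the practical dividing line between the two situations.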

Is my understanding correct on this?

**Physics Forums | Science Articles, Homework Help, Discussion**


# Linear Regression, Linear Least Squares, Least Squares, Non-linear Least Squares
