It seems to me that Linear Regression and Linear Least Squares are often used interchangeably, but I believe there are subtle differences between the two. From what I can tell (for simplicity, let's assume the uncertainty is in y only):

**Linear Regression** refers to the general case of fitting a straight line to a set of data, but the method of determining the optimal fit can be most anything (e.g. sum of vertical differences, sum of absolute values of vertical differences, maximum vertical difference, sum of squares of vertical differences, etc.), whereas

**Linear Least Squares** refers to a specific measure of optimal fit, namely the sum of the squares of the vertical differences.

Actually, it seems to me that Linear Least Squares doesn't necessarily mean that you are fitting a straight line to the data; it just means that the modelling function is linear in the unknowns (e.g. [itex]y = ax^2 + bx + c[/itex] is linear in a, b, and c). Perhaps it is established convention that Linear Least Squares does, in fact, refer to fitting a straight line, whereas **Least Squares** is the more general case?

Lastly, **Non-linear Least Squares** refers to cases where the modelling function is not linear in the unknowns (e.g. [itex]y = e^{-ax^b}[/itex], where a and b are sought).

Is my understanding correct on this?
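To make the distinction concrete, here is a sketch of my own (the data and starting values are made up for illustration): the quadratic model is linear in its unknowns, so linear least squares solves it in one step, while the exponential model is non-linear in a and b and needs an iterative method — I use a bare-bones Gauss-Newton loop here, which is one common choice, not the only one.

```python
import numpy as np

# Linear least squares: y = a*x^2 + b*x + c is linear in a, b, c,
# even though its graph is not a straight line.
x = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
y = 2.0 * x**2 - 3.0 * x + 1.0                  # exact data: a=2, b=-3, c=1

A = np.column_stack([x**2, x, np.ones_like(x)])  # one column per unknown
coef, *_ = np.linalg.lstsq(A, y, rcond=None)     # minimises sum of squared
a, b, c = coef                                   # vertical differences

# Non-linear least squares: y = exp(-a * x**b) is non-linear in a and b,
# so there is no single linear solve. Gauss-Newton linearises the model
# at each step and solves a small linear least-squares problem.
def model(p, xs):
    return np.exp(-p[0] * xs**p[1])

xn = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
yn = model((0.5, 1.5), xn)                       # exact data: a=0.5, b=1.5

p = np.array([1.0, 1.0])                         # initial guess
for _ in range(50):
    f = model(p, xn)
    r = f - yn                                   # residuals
    # Jacobian of the residuals with respect to (a, b)
    J = np.column_stack([-xn**p[1] * f,
                         -p[0] * xn**p[1] * np.log(xn) * f])
    step, *_ = np.linalg.lstsq(J, -r, rcond=None)
    # halve the step while it fails to reduce the sum of squares
    t = 1.0
    while np.sum((model(p + t * step, xn) - yn)**2) > np.sum(r**2) and t > 1e-8:
        t *= 0.5
    p = p + t * step

print(a, b, c)   # approximately 2, -3, 1
print(p)         # approximately [0.5, 1.5]
```

The first fit recovers (a, b, c) exactly up to rounding, in a single `lstsq` call; the second has to iterate because the unknowns appear inside the exponential.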
