Exploring Nonlinear Least Squares for Regression Analysis

fog37
TL;DR Summary
understanding regression in general
Hello,

Regression analysis is about finding/estimating the coefficients for a particular function ##f## that would best fit the data. The function ##f## could be a straight line, an exponential, a power law, etc. The goal remains the same: finding the coefficients.

If the data does not show a linear trend, we cannot directly use linear regression. For example, in the case of data following an exponential trend, we can take the log of the ##Y## data (leaving the ##X## data alone) and get a straight-line relation between ##\log(Y)## and ##X##. At this point, we can apply least squares and get the required coefficients. That is a nice hack: turn the problem into a linear regression problem to find the coefficients using logs... The same goes for a power-law relation between ##Y## and ##X##...

A polynomial is simply an extension of the power law. I think we can apply least squares to minimize the ##MSE## without any log transformation... Is that correct?

What about other, more general relationships? I am looking into "nonlinear" least squares. At a high level, is it a technique to find the coefficients using a variation of least squares (I guess the ordinary least squares that minimizes the MSE is called linear least squares) without having to transform our data so that it follows a linear trend?

Thanks for any clarification!

fog37 said:
If the data does not show a linear trend, we cannot directly use linear regression. For example, in the case of data following an exponential trend, we can take the log of the ##Y## data (leaving the ##X## data alone) and get a straight-line relation between ##\log(Y)## and ##X##. At this point, we can apply least squares and get the required coefficients. That is a nice hack: turn the problem into a linear regression problem to find the coefficients using logs...
Yes.
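To make the log trick concrete, here is a minimal sketch with synthetic data (the coefficients 2.0 and 0.5 are made-up values for illustration): fit ##\log(Y)## against ##X## with ordinary linear regression, then undo the log to recover the exponential model's parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data following an exponential trend Y = a * exp(b * X),
# with multiplicative noise (a=2.0, b=0.5 are assumed for illustration)
a_true, b_true = 2.0, 0.5
x = np.linspace(0, 4, 50)
y = a_true * np.exp(b_true * x) * rng.lognormal(sigma=0.05, size=x.size)

# Take logs of Y only: log(Y) = log(a) + b * X is linear in X,
# so an ordinary least-squares line fit recovers b and log(a)
slope, intercept = np.polyfit(x, np.log(y), 1)
a_hat, b_hat = np.exp(intercept), slope
```

Note that least squares on the log scale minimizes squared errors in ##\log(Y)##, not in ##Y## itself, so the recovered coefficients are not identical to what a direct nonlinear fit would give, though they are usually close when the noise is small.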
fog37 said:
The same goes for a power-law relation between ##Y## and ##X##...
Not sure what the power law model is.
fog37 said:
A polynomial is simply an extension of the power law. I think we can apply least-squares to minimize the ##MSE## without any log transformation...Is that correct?
If you have data ##(y_i, x_i)## and you see that the curve ##Y = a X^2 + b## might fit, you can square your ##x_i## values and apply linear regression. You can extend this to polynomials. The "linear" part of linear regression refers to how the coefficients appear in the model.
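A short sketch of that idea, using made-up coefficients (##a = 3##, ##b = 1##) and synthetic noisy data: the model is nonlinear in ##X## but linear in ##a## and ##b##, so squaring the ##x_i## values reduces it to an ordinary straight-line fit.

```python
import numpy as np

rng = np.random.default_rng(1)

# Data generated from Y = a*X^2 + b plus noise (a=3, b=1 assumed)
x = np.linspace(-2, 2, 40)
y = 3.0 * x**2 + 1.0 + rng.normal(scale=0.2, size=x.size)

# The model is linear in the coefficients a and b, so regressing
# y on x**2 with a plain linear fit recovers them
a_hat, b_hat = np.polyfit(x**2, y, 1)
```

The same trick generalizes to full polynomials: regress ##y## on the columns ##x, x^2, \dots, x^k## and the problem is still linear least squares.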
fog37 said:
What about other more general relationships? I am looking into "nonlinear" least squares. At high level, is it a technique to find the coefficients using a variation of least-squares (I guess the ordinary least-squares which minimizes the MSE is called linear least-squares) without having to transform our data so it follows a linear trend?
No. It works as long as the coefficients appear linearly in the model.
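For contrast, here is a minimal sketch of a genuinely nonlinear fit: ##Y = a\,e^{bX}## is nonlinear in the coefficient ##b##, and nonlinear least squares minimizes the same sum of squared residuals iteratively from an initial guess, with no transformation of the data. The coefficients 2.0 and 0.7 are made-up values, and `scipy.optimize.curve_fit` is used as one common implementation.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(2)

# A model nonlinear in its coefficient b: Y = a * exp(b * X)
def model(x, a, b):
    return a * np.exp(b * x)

# Synthetic data with additive noise (a=2.0, b=0.7 assumed)
x = np.linspace(0, 3, 60)
y = model(x, 2.0, 0.7) + rng.normal(scale=0.1, size=x.size)

# Nonlinear least squares: iteratively minimize the sum of squared
# residuals starting from the initial guess p0
(a_hat, b_hat), _ = curve_fit(model, x, y, p0=(1.0, 1.0))
```

Unlike the log-transform approach, this minimizes squared errors on the original ##Y## scale, but it needs a reasonable starting guess and can fail to converge for a poor one.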

fog37
Thank you.

As far as linear regression goes, what happens if the scatter plot of ##Y## vs ##X## shows a good linear trend/association but the required model assumptions are not satisfied (the residuals are correlated, or do not have equal variance in the residuals-vs-##X## plot)? Is that even possible? Or will we not see a linear trend in the scatter plot if the assumptions are not met?

Thanks again

fog37 said:
Thank you.

As far as linear regression goes, what happens if the scatter plot of ##Y## vs ##X## shows a good linear trend/association but the required model assumptions are not satisfied (the residuals are correlated,
Correlated residuals sound like it is a time series. Is that what you mean?
fog37 said:
or do not have equal variance in the residuals-vs-##X## plot)?
Do you mean that the variance might be proportional to the ##Y## magnitude? That would imply a model like ##Y = \epsilon X##. I think you should try taking logarithms of both sides: ##\log Y = \log X + \epsilon_l##.
But there are a million similar things that might come up, so it is best to wait until you have a specific case and ask about that.

In general you want to transform data that exhibits power-law characteristics until the series is homoskedastic (constant variance). So, for example, if your right-hand-side variable is human height, you can leave it alone; but if it is wealth or the market cap of a stock, use logs so that the variance does not scale with the level of the variable.
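To illustrate the point above with made-up numbers: under multiplicative noise, the spread of the residuals grows with the level of the variable on the raw scale, but becomes roughly constant after taking logs.

```python
import numpy as np

rng = np.random.default_rng(3)

# Multiplicative noise: the spread of Y scales with its level
x = np.linspace(1, 5, 200)
y = np.exp(1.5 * x) * rng.lognormal(sigma=0.2, size=x.size)

# Residuals on the raw scale grow with x (heteroskedastic)...
raw_resid = y - np.exp(1.5 * x)
# ...but are roughly constant in spread after taking logs
log_resid = np.log(y) - 1.5 * x

raw_ratio = raw_resid[x >= 3].std() / raw_resid[x < 3].std()
log_ratio = log_resid[x >= 3].std() / log_resid[x < 3].std()
```

Comparing the two ratios shows the raw-scale residual spread blowing up at large ##x## while the log-scale spread stays near constant, which is exactly why logs are the natural transform for wealth-like variables.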
