Nonlinear regression in two or more independent variables


Discussion Overview

The discussion revolves around the application of nonlinear regression techniques when dealing with two or more independent variables. Participants explore numerical methods, particularly focusing on extending established techniques like Levenberg-Marquardt and Gauss-Newton for multiple independent variables.

Discussion Character

  • Exploratory
  • Technical explanation
  • Mathematical reasoning

Main Points Raised

  • One participant inquires about established numerical methods for nonlinear regression with multiple independent variables, noting familiarity with methods for single independent variables.
  • Another participant suggests broadening the search to include non-linear least squares minimization.
  • There is a discussion about the representation of independent variables as vectors and how to consolidate them into a single vector for mathematical operations.
  • A participant expresses difficulty in understanding the notation and how to incorporate multiple independent variables into the regression model.
  • One participant explains the general form of the model and the minimization process, encouraging further elaboration on the specific case being addressed.
  • A participant describes their goal of fitting parameters into a nonlinear model and seeks a robust numerical method for parameter estimation.
  • Another participant shares a breakthrough realization about treating data points as constants during the optimization process, leading to a successful implementation of the algorithm.

Areas of Agreement / Disagreement

The discussion contains multiple viewpoints and approaches regarding the application of nonlinear regression methods for multiple independent variables. No consensus is reached on a singular method or solution, and participants express varying levels of understanding and clarity on the topic.

Contextual Notes

Participants mention challenges related to notation and the complexity of incorporating multiple independent variables into regression models. There is also uncertainty regarding the calculation of estimated variances and the robustness of different numerical methods.

maistral
Hi. I wanted to learn more on this topic, but it seems all the available resources on the internet point to using R, SPSS, MINITAB or EXCEL.

Is there an established numerical method for such cases? I am aware of the Levenberg-Marquardt, Gauss-Newton and similar methods for nonlinear regression on one independent variable only. I would like to ask how I can extend these methods to two or more independent variables.
 
Hi,

I suggest you widen your search to include non-linear least squares minimization. Plenty of stuff there.
 
Hi. Thanks for replying.

I did search for nonlinear LSM as well, but I keep on getting information only on single independent variables.
 
? In the link I gave ##\bf x## is a vector -- or is that not what you are referring to ?
 
Errr. If I understand this paper correctly, x is a vector containing the values of the independent variable. My problem refers to a case where there are two independent variables (and thus, two vectors).

Unless I'm not reading this correctly; if that's the case, I extend my apologies.
 
No need for apologies.
Two independent vectors with N and M components, respectively, make one vector with N+M components ...
 
Yes. In this context, all the independent variables are usually consolidated into a single vector, X. That makes the math notation much more concise in terms of vector operations.
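As a minimal sketch of that consolidation in Python (the data and the toy model below are made up for illustration, not from the thread):

```python
import numpy as np

# Hypothetical data: two independent variables measured at the same 5 points.
x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
x2 = np.array([0.5, 1.5, 2.5, 3.5, 4.5])

# Consolidate into a single N x 2 array: one row per data point,
# one column per independent variable.
X = np.column_stack([x1, x2])
print(X.shape)  # (5, 2)

# A model f(X, p) can now take the whole array at once, e.g. a
# toy nonlinear model with parameters p = (a, b):
def f(X, p):
    a, b = p
    return a * np.exp(b * X[:, 0]) * X[:, 1]

yhat = f(X, (1.0, 0.1))
print(yhat.shape)  # (5,)
```

The rest of the machinery (residuals, Jacobian, normal equations) then works on `X` exactly as it would on a single independent-variable vector.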
 
This is making me feel sad. I am totally unable to comprehend the notation. I have no idea how to insert the second set of independent variables.

As I understood it, it should look like a matrix of N rows (data) x M columns (independent variables). The Jacobian should be simple, I guess, as it's simply the partial derivative of the function with respect to each modelling parameter, evaluated at the independent variables.

Then I go to a dead end. I have no idea how to proceed further. I mean, I can evaluate JTJ and such; but regarding the independent variables, do I just move forward with the matrix operations?
 
maistral said:
I am totally unable to comprehend the notation. I have no idea how to insert the second set of independent variables.
I have a feeling it's not the notation that's the barrier here. You have some experiments where you vary something (the independent variables ##\vec x## ) and you measure something (the dependent variables ##\vec y##). Holding those separate is already a major task in many cases... :rolleyes:.

And you have some idea in the form of a model that expresses the ##y_i## as a function of the ##x_j\, : \ \ \ \hat y_i = f_i (\vec x, \vec p) . \ \ ## Here ##\ \hat y_i\ ## is the expected (predicted) value of the actual measurement ##\ y_i \, .\ \ ## And ##\ \vec p\ ## is a vector of parameters, the things you are after.

What you want to minimize is ##\displaystyle {\sum {(y_i - \hat y_i)^2\over \sigma_i^2} } ## by suitably varying the ##p_k .\ \ ## The ##\sigma_i## are the (estimated) standard deviations in the ##\ y_i##.

This is a bit general; perhaps we can make this easier if you elaborate a bit more on what exactly you are doing, or by focusing on a comparable example ?
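As a concrete sketch of that minimization (synthetic data and a made-up model; `scipy.optimize.least_squares` with `method="lm"` is one readily available Levenberg-Marquardt implementation):

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)

# Synthetic data for a toy model yhat = p0 * exp(p1 * x1) + p2 * x2.
N = 50
x1 = rng.uniform(0, 1, N)
x2 = rng.uniform(0, 1, N)
p_true = np.array([2.0, 1.5, -0.7])
sigma = 0.05 * np.ones(N)   # estimated std dev of each y_i
y = p_true[0] * np.exp(p_true[1] * x1) + p_true[2] * x2 + rng.normal(0, sigma)

def model(p, x1, x2):
    return p[0] * np.exp(p[1] * x1) + p[2] * x2

# least_squares minimizes sum(r_i^2), so dividing each residual by
# sigma_i yields the weighted objective sum((y_i - yhat_i)^2 / sigma_i^2).
def residuals(p):
    return (y - model(p, x1, x2)) / sigma

fit = least_squares(residuals, x0=[1.0, 1.0, 0.0], method="lm")
print(fit.x)  # should be close to p_true
```

Note that the two independent variables enter only through the residual function; the optimizer itself never needs to know how many there are.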
 
Hi! Thank you very much for replying.

Yes, I'm actually looking for a general case. What I'm trying to do is; say, I have this set of data:
[Image of the data table with columns x, y, and Z — not available.]

I wanted to fit the modelling parameters a, b, and z into Z(x,y). While this can be transformed into a multiple linear regression problem, I wanted to be able to do it using nonlinear regression. That's why I was asking if there is a numerical method tailored for these kinds of problems, or is there a way to extend, say for example, at the very least the Gauss-Newton method.

EDIT: I think I understand the strategy you mentioned. Using guess values of the parameters a, b, and z, I'll evaluate the modelling function at those initial guess values and at the points x, y. Then I'll get the sum of the squares of the residuals (something like ##(Z_\text{measured} - Z_\text{predicted})^2##). Then the problem becomes a minimization problem which I can kill using something similar to the Newton-Raphson for optimization or steepest descent methods. I do not remember using that estimated variance though (I don't even know how to calculate that)...

My question was if there is a more 'robust' or algorithmic method in order to determine the parameters; something similar to the Levenberg-Marquardt.
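One robust, off-the-shelf route is `scipy.optimize.curve_fit`, which wraps Levenberg-Marquardt and accepts multiple independent variables packed into a tuple. A sketch — the actual data table and model from the post aren't shown, so the model form Z(x, y) = a·x^b·y^z and the data here are hypothetical:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical model and synthetic data (the thread's real table is not available).
rng = np.random.default_rng(1)
x = rng.uniform(1, 5, 40)
y = rng.uniform(1, 5, 40)
a_true, b_true, z_true = 3.0, 0.8, 1.4
Z = a_true * x**b_true * y**z_true * (1 + rng.normal(0, 0.01, 40))

# curve_fit lets the "independent variable" be any object: pack both
# x and y into a tuple and unpack it inside the model function.
def model(xy, a, b, z):
    x, y = xy
    return a * x**b * y**z

popt, pcov = curve_fit(model, (x, y), Z, p0=[1.0, 1.0, 1.0])
print(popt)                    # should be close to (3.0, 0.8, 1.4)
perr = np.sqrt(np.diag(pcov))  # estimated standard errors of a, b, z
```

The diagonal of `pcov` gives estimated parameter variances, which is one way to get at the "estimated variance" question raised above.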
 
Hold on. I'm squealing here because of a tiny eureka moment. I'll try studying this on my own and see what happens.
 
Aha. I made it work.

Basically I just need to run the algorithm normally since the partial derivatives are with respect to the modelling parameters anyway and the method would just treat the data points as constants. I had this idea since your post reminded me of that generic method of attempting to reduce the sum of the squares of residuals.
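A minimal sketch of that idea (toy model and synthetic data, not the poster's actual problem): a bare Gauss-Newton loop in which the Jacobian is taken with respect to the parameters only, the data points being held constant throughout:

```python
import numpy as np

# Toy model y = p0 * exp(-p1 * x1) * x2 with noise-free synthetic data.
rng = np.random.default_rng(2)
x1 = rng.uniform(0, 2, 30)
x2 = rng.uniform(0, 2, 30)
p_true = np.array([1.5, 0.7])
y = p_true[0] * np.exp(-p_true[1] * x1) * x2

def model(p):
    return p[0] * np.exp(-p[1] * x1) * x2

def jacobian(p):
    # Analytic partials of the model w.r.t. each parameter,
    # evaluated at the (fixed) data points: an N x 2 matrix.
    J = np.empty((len(x1), 2))
    J[:, 0] = np.exp(-p[1] * x1) * x2
    J[:, 1] = -p[0] * x1 * np.exp(-p[1] * x1) * x2
    return J

p = np.array([1.0, 0.5])                 # initial guess
for _ in range(20):
    r = y - model(p)                     # residuals (data are constants)
    J = jacobian(p)
    # Gauss-Newton step: solve (J^T J) dp = J^T r
    dp = np.linalg.solve(J.T @ J, J.T @ r)
    p = p + dp
    if np.linalg.norm(dp) < 1e-12:
        break

print(p)  # converges toward p_true = (1.5, 0.7)
```

Nothing in the loop cares how many independent variables feed into `model` and `jacobian`; they only affect the residual values, exactly as described above.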

Moral lesson: do not over-analyze, lol.
Thanks again!
 
