Isolate variables in nonlinear equation for regression

Summary:
The discussion revolves around calibrating a nonlinear equation that relates tidal parameters to river discharge using Matlab's nlinfit function. The equation includes a complex term that complicates the regression due to the tidal range variable appearing on the right-hand side. Suggestions include using total least squares regression instead of ordinary regression to account for measurement errors across all variables. It is noted that if the dependent variable spans the real line, linear regression may be applicable, provided the contributing variables are independent. Accurate estimates require consideration of correlation and covariance among the variables to avoid underestimating standard errors.
edge333
Hi all,

I have a nonlinear equation of the form:

$$\frac{TP_x}{TP_R} = c_0 + c_1 U_R^n + c_2 \frac{T_R^2}{\sqrt{U_R}}$$

This equation describes the relationship between tidal parameters and river discharge (velocity) in tidal rivers, derived from the 1-D St. Venant equations. ##TP_x## is some tidal property at station ##x## along the river, ##TP_R## is the same tidal property at a coastal reference station, ##U_R## is the river flow (velocity), and ##T_R## is the tidal range at the reference station.

What I am trying to do is calibrate this model using contemporary data where ##TP_x##, ##TP_R##, ##T_R##, and ##U_R## are known. Therefore, I am trying to determine the coefficients (##c_0##, ##c_1##, ##c_2##, and ##n##) of this equation using Matlab's nlinfit function. The problem is that the last term complicates the regression because the ##T_R## variable appears on the right-hand side. Is there a way of separating the independent and dependent variables for this regression? Any combination of ##TP_R##, ##TP_x##, and ##T_R## could be considered the independent variable. I should also mention that ##0.5 < n < 1.5##.
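Since ##T_R## and ##U_R## are both measured, they can both be passed to the fitter as known regressors; only the ratio ##TP_x/TP_R## needs to play the role of the response. A minimal sketch of this setup in Python with scipy's `curve_fit` (analogous to Matlab's nlinfit; the data below are synthetic stand-ins, not the poster's measurements, and the "true" coefficients are invented for illustration):

```python
import numpy as np
from scipy.optimize import curve_fit

def tidal_model(X, c0, c1, c2, n):
    """Model: TPx/TPR = c0 + c1*U**n + c2*T**2/sqrt(U).
    X packs the two known regressors: river velocity U and tidal range T."""
    U, T = X
    return c0 + c1 * U**n + c2 * T**2 / np.sqrt(U)

# Synthetic example data (stand-ins for field measurements)
rng = np.random.default_rng(0)
U = rng.uniform(0.2, 2.0, 50)              # river velocity U_R
T = rng.uniform(0.5, 3.0, 50)              # tidal range T_R
true = (0.3, 0.8, -0.1, 1.2)               # illustrative c0, c1, c2, n
y = tidal_model((U, T), *true) + rng.normal(0, 0.01, 50)  # TPx/TPR

# Constrain n to the physically expected range 0.5 < n < 1.5
popt, pcov = curve_fit(
    tidal_model, (U, T), y,
    p0=[0.0, 1.0, 0.0, 1.0],
    bounds=([-np.inf, -np.inf, -np.inf, 0.5],
            [np.inf, np.inf, np.inf, 1.5]),
)
print(popt)  # estimated (c0, c1, c2, n)
```

In nlinfit the pattern is the same: pass a matrix whose columns are ##U_R## and ##T_R## as the predictor argument, and the measured ratio ##TP_x/TP_R## as the response.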

thanks
 
I don't use Matlab, but after some web browsing, I suggest you consider doing total least squares regression (http://www.mathworks.com/matlabcentral/fileexchange/31109-total-least-squares-method) instead of ordinary regression.

In a least-squares regression fit of Y = F(x1, x2, ...) to data, it is assumed that x1, x2, ... are measured without "error" and that the object is to find a curve that fits noisy data for Y. So if one of your variables ##TP_x, TP_R, T_R, U_R## is measured very precisely and is free of any conceptual random variations, you could use it for Y. However, if all your variables are on an equal footing as far as measurement errors or random fluctuations go, then total least squares regression would be a better choice.
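This suggestion can be prototyped outside Matlab as well: scipy's `odr` module implements orthogonal distance regression, the nonlinear generalization of total least squares, which accounts for errors in the regressors and the response simultaneously. A sketch under assumed conditions (synthetic data; the measurement noise levels `sx`/`sy` are illustrative choices, not values from the thread):

```python
import numpy as np
from scipy import odr

def tidal_model(beta, x):
    """TPx/TPR = c0 + c1*U**n + c2*T**2/sqrt(U); beta = (c0, c1, c2, n)."""
    c0, c1, c2, n = beta
    U, T = x
    return c0 + c1 * U**n + c2 * T**2 / np.sqrt(U)

# Synthetic truth, then noise added to U, T, AND the response
rng = np.random.default_rng(42)
U_true = rng.uniform(0.2, 2.0, 80)
T_true = rng.uniform(0.5, 3.0, 80)
y_true = tidal_model((0.3, 0.8, -0.1, 1.2), (U_true, T_true))

x_obs = np.vstack([U_true + rng.normal(0, 0.02, 80),
                   T_true + rng.normal(0, 0.02, 80)])
y_obs = y_true + rng.normal(0, 0.02, 80)

# sx/sy are the assumed measurement standard deviations
data = odr.RealData(x_obs, y_obs, sx=0.02, sy=0.02)
fit = odr.ODR(data, odr.Model(tidal_model), beta0=[0.0, 1.0, 0.0, 1.0]).run()
print(fit.beta)     # estimated coefficients (c0, c1, c2, n)
print(fit.sd_beta)  # their standard errors
```

Note that `odr` does not accept parameter bounds, so the constraint ##0.5 < n < 1.5## would have to be checked after the fit.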
 
Hey edge333.

Building on the post from Stephen Tashi - if the dependent variable spans the real line, then linear regression can be used. If the variables contributing to the dependent variable are independent, then the whole process is easier.

If not, you will need to use correlation/covariance information to get accurate estimates (basically, the standard errors will be under-estimated if things are correlated in some manner).
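One concrete way to inspect this is through the parameter covariance matrix the fitter returns (nlinfit's `CovB` output, or `pcov` from scipy's `curve_fit`): correlated regressors show up as large off-diagonal parameter correlations and inflated standard errors. A sketch, assuming the same model form, with deliberately correlated synthetic regressors (my construction, not data from the thread):

```python
import numpy as np
from scipy.optimize import curve_fit

def tidal_model(X, c0, c1, c2, n):
    U, T = X
    return c0 + c1 * U**n + c2 * T**2 / np.sqrt(U)

rng = np.random.default_rng(1)
U = rng.uniform(0.2, 2.0, 50)
T = 1.0 + 1.5 * U + rng.normal(0, 0.3, 50)   # T deliberately correlated with U
y = tidal_model((U, T), 0.3, 0.8, -0.1, 1.2) + rng.normal(0, 0.01, 50)

popt, pcov = curve_fit(tidal_model, (U, T), y, p0=[0.0, 1.0, 0.0, 1.0],
                       bounds=([-np.inf] * 3 + [0.5], [np.inf] * 3 + [1.5]))
se = np.sqrt(np.diag(pcov))     # standard errors of (c0, c1, c2, n)
corr = pcov / np.outer(se, se)  # parameter correlation matrix
print(se)
print(corr)
```

Large entries in `corr` signal that the data cannot separate the corresponding coefficients well, and the individual standard errors should be read with that in mind.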
 