Multivariate Linear Regression With Coefficient Constraint

AI Thread Summary
The discussion focuses on implementing multivariate linear regression (MVLR) with a constraint that coefficients remain positive. The user is solving a least squares problem and seeks methods for constrained minimization, particularly using Lagrangian techniques. They mention the REG procedure in SAS for constrained optimization and express a preference for MATLAB, discovering that the lsqnonneg function meets their needs. The conversation also highlights the importance of checking coefficients after running unrestricted models to determine if constraints are necessary. Overall, the thread provides insights into techniques for achieving positive coefficients in MVLR.
dansu
[SOLVED] Multivariate Linear Regression With Coefficient Constraint

I'm attempting a multivariate linear regression (MVLR) by the method of least squares. Basically, I'm solving the following system for the ##\beta_p##:
$$\begin{bmatrix} \sum y \\ \sum x_1 y \\ \sum x_2 y \\ \sum x_3 y \end{bmatrix} = \begin{bmatrix} n & \sum x_1 & \sum x_2 & \sum x_3 \\ \sum x_1 & \sum x_1^2 & \sum x_1 x_2 & \sum x_1 x_3 \\ \sum x_2 & \sum x_2 x_1 & \sum x_2^2 & \sum x_2 x_3 \\ \sum x_3 & \sum x_3 x_1 & \sum x_3 x_2 & \sum x_3^2 \end{bmatrix} \begin{bmatrix} \beta_0 \\ \beta_1 \\ \beta_2 \\ \beta_3 \end{bmatrix}$$

The ##x_i## are the predictor data sets, ##y## is the data I want to fit, and the ##\beta_p## are the coefficients.

My problem is that I want to set a constraint such that \beta remains positive. What would be a good way to achieve this?
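As an aside, the system above is just the normal equations ##(X^T X)\beta = X^T y##: the entries ##n##, ##\sum x_i##, ##\sum x_i x_j## are exactly what ##X^T X## collects when ##X## carries a leading column of ones. A minimal sketch with made-up data, written in Python/NumPy as a stand-in for MATLAB (where `X \ y` solves the same least-squares problem):

```python
import numpy as np

# Hypothetical data: three predictors x1..x3 and a response y.
rng = np.random.default_rng(0)
n = 50
X = np.column_stack([np.ones(n),          # column of ones for beta0
                     rng.random(n),       # x1
                     rng.random(n),       # x2
                     rng.random(n)])      # x3
y = X @ np.array([1.0, 2.0, 0.5, 3.0]) + 0.01 * rng.standard_normal(n)

# Normal equations: (X^T X) beta = X^T y -- the same 4x4 system as above,
# since X^T X contains the sums n, sum x_i, sum x_i x_j.
beta = np.linalg.solve(X.T @ X, X.T @ y)
```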
 
In general, you need to set up a restricted minimization problem (Lagrangian) for the sum of squared errors (SSE); then minimize SSE subject to the constraint.

E.g. the REG procedure in SAS has a RESTRICT statement, which switches to a constrained optimization algorithm.

Specifically, if you run the unrestricted MVLR and it produces positive coefficients, then you don't need the constraint. If an unrestricted coefficient is < 0, then you need to use constrained minimization.
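That check is easy to automate: fit the unrestricted model and inspect the signs. A hedged sketch with invented data, in Python/NumPy as a stand-in for MATLAB:

```python
import numpy as np

# Hypothetical data where the true second slope is negative, so the
# unrestricted fit should flag exactly that coefficient.
rng = np.random.default_rng(1)
X = np.column_stack([np.ones(40), rng.random((40, 3))])
y = X @ np.array([0.5, 1.0, -2.0, 1.5])   # noise-free for a clean check

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
needs_constraint = beta[1:] < 0   # boolean mask over the slope coefficients
```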
 
If you don't mind, could you or somebody else explain that procedure in more detail? I have only minimal experience with Lagrange multipliers, and I'm unfamiliar with SAS. I've been running my numbers through MATLAB.
 
In Mathematica, the NMinimize and NMaximize functions solve nonlinear constrained global optimization problems. Perhaps you can search for similar functions in MATLAB?
 
This might work; but I'd constrain the betas (to zero) one at a time, testing each alternative separately. Note that forcing a coefficient to zero is exactly the nonnegativity constraint binding at its boundary.

E.g. if beta1 < 0 and beta3 < 0 in the full model, then first minimize (y - b0 - b1 x1 - b2 x2 - b4 x4)^2 and see whether beta1 is still < 0. Separately, minimize (y - b0 - b2 x2 - b3 x3 - b4 x4)^2 and see whether beta3 is still < 0. If the answer to both questions is "yes," then minimize (y - b0 - b2 x2 - b4 x4)^2. In all other cases (yes&no, no&yes, no&no), you'll have a model selection problem, and you will have to use a model selection criterion (e.g. the F statistic or the adjusted R-squared).
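The drop-one-at-a-time procedure above might be sketched as follows, again in Python/NumPy as a stand-in for MATLAB, with hypothetical data (two of the four true slopes are negative):

```python
import numpy as np

def fit(X, y, keep):
    """Least-squares fit using only the columns listed in `keep`;
    dropped coefficients are reported as 0."""
    b = np.zeros(X.shape[1])
    b[keep], *_ = np.linalg.lstsq(X[:, keep], y, rcond=None)
    return b

rng = np.random.default_rng(2)
X = np.column_stack([np.ones(60), rng.random((60, 4))])
y = X @ np.array([1.0, 2.0, -0.5, 1.0, -0.3]) + 0.01 * rng.standard_normal(60)

full = fit(X, y, list(range(5)))
negative = [j for j in range(1, 5) if full[j] < 0]   # offending slopes

# Drop each offender one at a time and refit, as the post suggests;
# if an offender stays negative in every refit, drop it for good,
# otherwise compare candidate models with adjusted R^2 or an F test.
for j in negative:
    keep = [k for k in range(5) if k != j]
    refit = fit(X, y, keep)
```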
 
Thank you all for your help.

I have found that the MATLAB function lsqnonneg does exactly what I am looking for. I'm speculating that it follows a procedure similar to what vedenev and EnumaElish are suggesting.
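For what it's worth, `lsqnonneg` implements the Lawson–Hanson active-set algorithm from *Solving Least Squares Problems*. A much simpler (and slower) stand-in, shown here in Python/NumPy only to illustrate the idea of nonnegative least squares, is projected gradient descent; this is not the algorithm `lsqnonneg` actually uses:

```python
import numpy as np

def nnls_pg(A, b, iters=5000):
    """Nonnegative least squares min ||A x - b||^2 s.t. x >= 0,
    by projected gradient descent: take a gradient step, then
    clamp negative entries to zero."""
    x = np.zeros(A.shape[1])
    step = 1.0 / np.linalg.norm(A.T @ A, 2)   # 1/L, L = largest eigenvalue
    for _ in range(iters):
        x = np.maximum(0.0, x - step * (A.T @ (A @ x - b)))
    return x

# Tiny check: a problem whose unconstrained solution is [2, -1],
# so the nonnegativity constraint is active on the second entry.
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
b = np.array([2.0, -1.0, 1.0])
x = nnls_pg(A, b)   # second coefficient clamps to 0, first becomes 1.5
```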
 