# [SOLVED] Multivariate Linear Regression With Coefficient Constraint

I'm attempting a multivariate linear regression (MVLR) by the method of least squares. Basically, I'm solving the following matrix system for the $$\beta_p$$: $$\begin{bmatrix} \sum y \\ \sum x_1 y \\ \sum x_2 y \\ \sum x_3 y \end{bmatrix} = \begin{bmatrix} n & \sum x_1 & \sum x_2 & \sum x_3 \\ \sum x_1 & \sum x_1^2 & \sum x_1 x_2 & \sum x_1 x_3 \\ \sum x_2 & \sum x_2 x_1 & \sum x_2^2 & \sum x_2 x_3 \\ \sum x_3 & \sum x_3 x_1 & \sum x_3 x_2 & \sum x_3^2 \end{bmatrix}\begin{bmatrix} \beta_0 \\ \beta_1 \\ \beta_2 \\ \beta_3 \end{bmatrix}$$

The $$x_i$$ are sets of predictor data, $$y$$ is the data I want to fit, and the $$\beta$$ are the coefficients.

My problem is that I want to impose a constraint so that the $$\beta$$ remain positive. What would be a good way to achieve this?
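For reference, the unconstrained system above is just the normal equations $$A^T A \beta = A^T y$$, which can be solved directly. A minimal NumPy sketch with made-up data (all names and values here are hypothetical, chosen only to illustrate the setup):

```python
import numpy as np

# Hypothetical data: three predictor sets x1, x2, x3 and a response y.
rng = np.random.default_rng(0)
n = 50
X = rng.normal(size=(n, 3))
y = 2.0 + 1.5 * X[:, 0] - 0.5 * X[:, 1] + 0.1 * X[:, 2] + rng.normal(scale=0.1, size=n)

# Design matrix with a leading column of ones for the intercept beta_0.
A = np.column_stack([np.ones(n), X])

# Normal equations: (A^T A) beta = A^T y. The entries of A^T A are exactly
# the sums in the matrix above: n, sum x_i, and sum x_i x_j.
beta = np.linalg.solve(A.T @ A, A.T @ y)
print(beta)  # nothing stops these entries from coming out negative
```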

EnumaElish
Homework Helper
In general, you need to set up a restricted minimization problem (Lagrangian) for the sum of squared errors (SSE); then minimize SSE subject to the constraint.

E.g. the REG procedure in SAS uses a RESTRICT statement, which invokes a constrained optimization algorithm.

Specifically, if you run the unrestricted MVLR and it outputs positive coefficients, then you don't need the constraint. If any unrestricted coefficient is < 0, then you need to use constrained minimization.
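The constrained minimization described above can be sketched in SciPy, using bound constraints in place of an explicit Lagrangian (the data here is made up; note the second slope is deliberately negative so the constraint actually binds):

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical data where the true coefficient on x2 is negative,
# so the unconstrained fit would violate beta >= 0.
rng = np.random.default_rng(1)
n = 50
X = rng.normal(size=(n, 3))
y = 1.0 + 2.0 * X[:, 0] - 1.0 * X[:, 1] + 0.5 * X[:, 2] + rng.normal(scale=0.1, size=n)
A = np.column_stack([np.ones(n), X])

def sse(beta):
    """Sum of squared errors for a candidate coefficient vector."""
    r = y - A @ beta
    return r @ r

# Minimize SSE subject to beta_p >= 0 for every p.
res = minimize(sse, x0=np.ones(4), bounds=[(0, None)] * 4)
print(res.x)  # the coefficient that "wants" to be negative is pushed to 0
```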

Last edited:
If you don't mind, could you or somebody else explain that procedure in more detail? I have only minimal experience working with Lagrange multipliers, and I'm unfamiliar with SAS. I've been running my numbers through Matlab.

EnumaElish
Homework Helper
In Mathematica, the NMinimize and NMaximize functions are used to solve constrained nonlinear global optimization problems. Perhaps you can search for similar functions in Matlab?
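In Matlab, lsqnonneg and lsqlin handle exactly this class of problem. The SciPy analogue is scipy.optimize.nnls, which solves min ‖Aβ − y‖₂ subject to β ≥ 0 directly; a sketch with made-up data:

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical design matrix and response; the true x2 coefficient is negative.
rng = np.random.default_rng(2)
A = np.column_stack([np.ones(30), rng.normal(size=(30, 3))])
y = A @ np.array([1.0, 2.0, -1.0, 0.5]) + rng.normal(scale=0.1, size=30)

# Non-negative least squares: all returned coefficients are >= 0,
# with binding constraints set exactly to zero.
beta, residual_norm = nnls(A, y)
print(beta)
print(residual_norm)  # ||A beta - y||_2 at the solution
```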
