Multivariate Linear Regression With Coefficient Constraint

  • Thread starter dansu
  • #1
[SOLVED] Multivariate Linear Regression With Coefficient Constraint

I'm attempting a multivariate linear regression (MVLR) by the method of least squares. Basically, I'm solving a system of the following form for the [tex]\beta_p[/tex]:
[tex]\begin{bmatrix} \sum y \\ \sum x_1 y \\ \sum x_2 y \\ \sum x_3 y \end{bmatrix} = \begin{bmatrix} n & \sum x_1 & \sum x_2 & \sum x_3 \\ \sum x_1 & \sum x_1^2 & \sum x_1 x_2 & \sum x_1 x_3 \\ \sum x_2 & \sum x_2 x_1 & \sum x_2^2 & \sum x_2 x_3 \\ \sum x_3 & \sum x_3 x_1 & \sum x_3 x_2 & \sum x_3^2 \end{bmatrix} \begin{bmatrix} \beta_0 \\ \beta_1 \\ \beta_2 \\ \beta_3 \end{bmatrix}[/tex]

The [tex]x[/tex]'s are sets of data, [tex]y[/tex] is the data I want to fit, and the [tex]\beta[/tex]'s are the coefficients.

My problem is that I want to set a constraint such that each [tex]\beta_p[/tex] remains positive. What would be a good way to achieve this?
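For reference, here is a minimal sketch of the unconstrained setup above in Matlab; the names y, x1, x2, x3 are just illustrative column vectors of equal length.

[code]
% Unconstrained normal equations from the system above (sketch only).
X   = [ones(size(y)) x1 x2 x3];   % design matrix with an intercept column
XtX = X' * X;                     % the matrix of sums multiplying the betas
Xty = X' * y;                     % the vector of sums on the left-hand side
beta = XtX \ Xty;                 % beta_0 ... beta_3, no sign constraint yet
[/code]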
 

Answers and Replies

  • #2
EnumaElish
In general, you need to set up a restricted minimization problem (a Lagrangian) for the sum of squared errors (SSE), then minimize the SSE subject to the constraint.

E.g., the REG procedure in SAS has a RESTRICT statement, which invokes a constrained optimization algorithm.

Specifically, if you run the unrestricted MVLR and every coefficient comes out positive, then you don't need the constraint. If any unrestricted coefficient is < 0, then you need to use constrained minimization.
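A rough Matlab sketch of that check, assuming the X and y from post #1 (variable names are illustrative):

[code]
beta_u = X \ y;            % unrestricted least-squares fit
if all(beta_u >= 0)
    beta = beta_u;         % constraint is not binding; keep the unrestricted fit
else
    % at least one coefficient is negative, so a constrained
    % minimization of the SSE is needed (see the posts below)
end
[/code]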
 
  • #3
dansu
If you don't mind, could you or somebody else explain that procedure in more detail? I have only minimal experience working with Lagrange multipliers. I'm unfamiliar with SAS; I've been running my numbers through Matlab.
 
  • #4
EnumaElish
In Mathematica, the NMinimize and NMaximize functions are used to solve nonlinear constrained global optimization problems. Perhaps you can search for similar functions in Matlab?
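One possible Matlab counterpart, sketched under the assumption that the Optimization Toolbox is available and reusing the X and y from post #1, is lsqlin, which solves least-squares problems with bound constraints:

[code]
lb   = zeros(size(X, 2), 1);                  % lower bound: every beta >= 0
beta = lsqlin(X, y, [], [], [], [], lb, []);  % no other linear constraints or upper bounds
[/code]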
 
  • #5
EnumaElish
This might work, but I'd constrain the betas (to zero) one at a time, and do it alternately.

E.g., if beta1 < 0 and beta3 < 0 in the full model (say, a model with four regressors x1 through x4), then first minimize the sum of (y - b0 - b1 x1 - b2 x2 - b4 x4)^2 and see whether beta1 is still < 0. Alternately, minimize the sum of (y - b0 - b2 x2 - b3 x3 - b4 x4)^2 and see whether beta3 is still < 0. If the answer to both questions is "yes," then minimize the sum of (y - b0 - b2 x2 - b4 x4)^2. In all other cases (yes & no, no & yes, no & no), you'll have a model selection problem and will have to use a model selection criterion (e.g., the F statistic or the adjusted R-squared).
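Translated to the three-regressor model from post #1, the procedure might look roughly like this (a sketch; the "beta1 < 0 and beta3 < 0" starting point is hypothetical):

[code]
X = [ones(size(y)) x1 x2 x3];    % full design matrix: columns are 1, x1, x2, x3
b_drop3 = X(:, [1 2 3]) \ y;     % constrain beta3 to zero (drop x3); b_drop3(2) is beta1
b_drop1 = X(:, [1 3 4]) \ y;     % constrain beta1 to zero (drop x1); b_drop1(3) is beta3
if b_drop3(2) < 0 && b_drop1(3) < 0
    b_drop13 = X(:, [1 3]) \ y;  % both still negative: drop x1 and x3 together
end
[/code]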
 
  • #6
dansu
Thank you all for your help.

I have found that the Matlab function lsqnonneg does exactly what I am looking for. I'm speculating that it follows a procedure similar to what vedenev and EnumaElish are suggesting.
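For anyone finding this later, a minimal usage sketch (again assuming the X and y from post #1):

[code]
beta = lsqnonneg(X, y);   % minimizes ||X*beta - y|| subject to beta >= 0
[/code]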
 
