How can I find the optimal vector x for a constrained optimization problem?

dilbert2011
Hi all,
I am working on a project and stuck at the following problem.

Find vector x_{n\times 1} which minimizes the function
[tex]f(x) = \sum_{i=1}^{n}x_{i}^{2}[/tex]

subject to the linear equality constraint
[tex][A]_{m\times n} x_{n \times 1}=b_{m\times 1}[/tex] with [tex]m\leq n[/tex]

The function f(x) trivially has a minimum value of 0 at x = 0. The equality constraint is an underdetermined system with infinitely many solutions, and at least one of those solutions must minimize f(x), so I am confident a solution to this problem exists. The additional issue is that the rank of the matrix A could be less than m, so the standard Lagrange multiplier approach does not work directly.

Any suggestions on how to approach this problem ?
 
If the rank of A is less than m, then your m constraint equations are not all independent. If you convert A to a smaller set of independent constraints, you can then use Lagrange multipliers.

The "best" way to do that would probably be to figure out why the redundant constraints have got into your math model, and remove them from it. Alternatively, you can find the rank of A and a set of independent basis vectors (i.e. independent constraints) numerically.

Note that the set of independent constraints is not unique, but it doesn't matter which independent set you find: any set will give you the same solution for x.
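As a concrete sketch of this approach (hypothetical data; the SVD is used here as one numerical way to extract an independent set of constraints):

```python
import numpy as np

# Hypothetical example: m = 3 constraints in n = 4 unknowns, where the
# third row duplicates the first, so rank(A) = 2 < m.
A = np.array([[1., 2., 0., 1.],
              [0., 1., 1., 0.],
              [1., 2., 0., 1.]])
b = np.array([1., 2., 1.])

# Use the SVD to find the numerical rank and a full-row-rank system
# spanning the same constraint space.
U, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-10 * s[0]))

A_red = np.diag(s[:rank]) @ Vt[:rank]   # independent constraints
b_red = U[:, :rank].T @ b               # consistent right-hand side

# With A_red of full row rank, Lagrange multipliers give
# x = A_red^T (A_red A_red^T)^{-1} b_red.
lam = np.linalg.solve(A_red @ A_red.T, b_red)
x = A_red.T @ lam
```

Any independent set works, as noted above; this particular choice (rotating into the left singular basis) is numerically convenient because A_red A_red^T comes out diagonal.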
 
AlephZero,

Unfortunately, there is no escaping redundant constraints in the problem I am looking at. The approach you suggest is something I have already thought of and is certainly doable. Indeed, if the equality constraints had full row rank, the problem would be much easier to deal with.

But I would still like to know if this is the *only* way to do this before I go ahead and code this thing up.
 
Another way is to explicitly eliminate variables from the problem. Find the largest coefficient |a_{jk}| in A, and rewrite the j'th constraint equation in the form

x_k= a linear equation in the other variables.

Eliminate x_k from the other constraint equations and from the quadratic form you are minimizing.

Repeat until the remaining constraints have all their coefficients zero (to within rounding error), i.e. they were redundant.

Mathematically, this is very much the same as the previous method, but it may be simpler to implement depending on the size of the problem relative to the number of constraints, and/or the sparseness of the matrix A.
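A rough sketch of the elimination loop described above (hypothetical data; this reduces the constraints to a full-row-rank set, after which the reduced system can be handled by Lagrange multipliers as before):

```python
import numpy as np

def eliminate_redundant(A, b, tol=1e-10):
    """Repeatedly pivot on the largest remaining coefficient |a_jk|,
    eliminate x_k from the other constraints, and stop when every
    remaining row is zero to within rounding error (i.e. redundant)."""
    M = np.hstack([A.astype(float), np.asarray(b, float).reshape(-1, 1)])
    m, n = A.shape
    kept, used_cols = [], []
    for _ in range(m):
        # find the largest coefficient among unused rows and columns
        j_best, k_best, best = None, None, tol
        for j in range(m):
            if j in kept:
                continue
            for k in range(n):
                if k not in used_cols and abs(M[j, k]) > best:
                    j_best, k_best, best = j, k, abs(M[j, k])
        if j_best is None:
            break   # all remaining constraints are redundant
        kept.append(j_best)
        used_cols.append(k_best)
        # eliminate x_{k_best} from every other constraint
        for i in range(m):
            if i != j_best:
                M[i] -= (M[i, k_best] / M[j_best, k_best]) * M[j_best]
    return M[kept, :n], M[kept, n]

# Hypothetical example with one redundant constraint (row 3 = row 1).
A = np.array([[1., 2., 0., 1.],
              [0., 1., 1., 0.],
              [1., 2., 0., 1.]])
b = np.array([1., 2., 1.])
A_red, b_red = eliminate_redundant(A, b)

# A_red now has full row rank, so the Lagrange-multiplier
# formula applies directly.
lam = np.linalg.solve(A_red @ A_red.T, b_red)
x = A_red.T @ lam
```

(This sketch only reduces the constraints; in the hand-worked version above you would also substitute each eliminated x_k into the quadratic form.)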

A third idea which I haven't investigated, but might work: just set up the Lagrange multiplier equation system for all the constraints including the redundant ones, then solve it by doing a singular value decomposition (SVD) and ignore the singular values that are zero. That certainly works for solving unconstrained least-squares problems which are under-determined because data points are not sufficient to fix the values for all the variables in the model. See any good numerical methods text under "SVD" or "least squares" for more details.
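That third idea can be sketched directly (hypothetical data): assemble the Lagrange multiplier system for all constraints, redundant ones included, and let an SVD-based solve discard the zero singular values. NumPy's pinv is SVD-based and drops singular values below a tolerance, so it does exactly this.

```python
import numpy as np

# Hypothetical rank-deficient example: rank(A) = 2 < m = 3.
A = np.array([[1., 2., 0., 1.],
              [0., 1., 1., 0.],
              [1., 2., 0., 1.]])
b = np.array([1., 2., 1.])
m, n = A.shape

# Stationarity of L(x, lam) = x.x + lam.(A x - b) gives the
# (singular) KKT system  [2I  A^T] [x  ]   [0]
#                        [A    0 ] [lam] = [b]
KKT = np.block([[2.0 * np.eye(n), A.T],
                [A, np.zeros((m, m))]])
rhs = np.concatenate([np.zeros(n), b])

# pinv computes an SVD and ignores the (numerically) zero singular
# values, exactly as suggested above.
sol = np.linalg.pinv(KKT) @ rhs
x = sol[:n]
```

The multipliers sol[n:] are not unique when A is rank-deficient, but x is, so this recovers the minimizer without any preprocessing of the constraints.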
 
Let me read up on the SVD literature and see if it can work.
 