What is the optimal solution for constrained nonlinear programming problems?

squenshl
Homework Statement


1. Maximise ##x_1^2+(x_2-5)^2## subject to ##x_1 \geq 0, x_2 \geq 0## and ##2x_1+x_2 \leq 4##.
2. Minimise ##x_1^2+(x_2-5)^2## subject to ##x_1 \geq 0, x_2 \geq 0## and ##2x_1+x_2 \leq 4##.
3. Maximise ##2x_2^2-x_1## subject to ##x_1 \geq 0, x_2 \geq 0## and ##x_1^2+x_2^2 \leq 1##.

Homework Equations

The Attempt at a Solution


1. After applying Lagrange multipliers I get ##x_1 = \lambda## and ##x_2 = \frac{\lambda+10}{2}##, which in turn gives ##\lambda = -\frac{2}{5}##, meaning ##x_1 = -\frac{2}{5}## and ##x_2 = \frac{24}{5}##. But ##x_1, \lambda < 0##, which violates ##x_1 \geq 0## and the Kuhn-Tucker conditions, so I'm not sure what to do next.
2. After applying Lagrange multipliers I get ##x_1 = -\lambda## and ##x_2 = \frac{10-\lambda}{2}##, which in turn gives ##\lambda = \frac{2}{5}##, meaning ##x_1 = -\frac{2}{5}## and ##x_2 = \frac{24}{5}##. But ##x_1 < 0##, which violates ##x_1 \geq 0##, so I'm not sure what to do next.
3. After applying Lagrange multipliers I get ##x_1 = -\frac{1}{2\lambda}## and ##x_2 = 0##, which in turn gives ##\lambda = 2##, meaning ##x_1 = -\frac{1}{4}##. But ##x_1 < 0##, which violates ##x_1 \geq 0##, so I'm not sure what to do next.
 
squenshl said:

Homework Statement


1. Maximise ##x_1^2+(x_2-5)^2## subject to ##x_1 \geq 0, x_2 \geq 0## and ##2x_1+x_2 \leq 4##.
2. Minimise ##x_1^2+(x_2-5)^2## subject to ##x_1 \geq 0, x_2 \geq 0## and ##2x_1+x_2 \leq 4##.
3. Maximise ##2x_2^2-x_1## subject to ##x_1 \geq 0, x_2 \geq 0## and ##x_1^2+x_2^2 \leq 1##.

Homework Equations

The Attempt at a Solution


1. After applying Lagrange multipliers I get ##x_1 = \lambda## and ##x_2 = \frac{\lambda+10}{2}##, which in turn gives ##\lambda = -\frac{2}{5}##, meaning ##x_1 = -\frac{2}{5}## and ##x_2 = \frac{24}{5}##. But ##x_1, \lambda < 0##, which violates ##x_1 \geq 0## and the Kuhn-Tucker conditions, so I'm not sure what to do next.
2. After applying Lagrange multipliers I get ##x_1 = -\lambda## and ##x_2 = \frac{10-\lambda}{2}##, which in turn gives ##\lambda = \frac{2}{5}##, meaning ##x_1 = -\frac{2}{5}## and ##x_2 = \frac{24}{5}##. But ##x_1 < 0##, which violates ##x_1 \geq 0##, so I'm not sure what to do next.
3. After applying Lagrange multipliers I get ##x_1 = -\frac{1}{2\lambda}## and ##x_2 = 0##, which in turn gives ##\lambda = 2##, meaning ##x_1 = -\frac{1}{4}##. But ##x_1 < 0##, which violates ##x_1 \geq 0##, so I'm not sure what to do next.

In (1), you may have ##x_1 = 0## at the optimal solution; the only way to be sure is to try it out and see if you satisfy all the Kuhn-Tucker conditions. If you write the Lagrangian as
$$L = x_1^2 + (x_2 - 5)^2 + \lambda (4 - 2 x_1 - x_2)$$
then you need
$$\begin{array}{l} \partial L /\partial x_1 \leq 0 , \; x_1 \geq 0 , \; \text{and at least one of these } = 0\\ \partial L /\partial x_2 \leq 0 , \; x_2 \geq 0 , \; \text{and at least one of these } = 0\\ \lambda \geq 0 , \; 2x_1 + x_2 \leq 4 , \; \text{and at least one of these is an equality} \end{array}$$
Can you satisfy these conditions when you assume ##x_1 = 0## and ##x_2 > 0##? If not, guess again and start over.

Similarly for the other cases that are giving you trouble.

Note: inequality-constrained nonlinear optimization problems are inherently much harder than equality-constrained ones. Often it is necessary to make several "guesses" about the nature of the optimum (variable = 0 or derivative = 0?, multiplier = 0 or constraint is tight?). Many NLP solvers solve a controlled sequence of equality-constrained problems (at least to sufficient accuracy to know correct signs of multipliers and variables, not necessarily to full accuracy), and then change one or more of the binding constraints, to see if the new solution works out, etc.
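As a numerical aside (my own illustration, not part of the homework, and assuming SciPy is available): SLSQP is one such sequential quadratic programming method, and it handles problem 2, the convex minimisation, directly. A minimal sketch:

```python
from scipy.optimize import minimize

# Problem 2: minimise x1^2 + (x2 - 5)^2
# subject to x1 >= 0, x2 >= 0, 2*x1 + x2 <= 4.
f = lambda x: x[0] ** 2 + (x[1] - 5.0) ** 2
cons = [{"type": "ineq", "fun": lambda x: 4.0 - 2.0 * x[0] - x[1]}]  # >= 0 form
res = minimize(f, x0=[1.0, 1.0], bounds=[(0.0, None), (0.0, None)],
               constraints=cons, method="SLSQP")
print(res.x, res.fun)  # expect approximately [0, 4] and 1.0
```

Geometrically this is the point of the feasible triangle closest to ##(0,5)##, namely ##(0,4)##, at squared distance 1.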
 
Great thanks.

If ##x_1 = 0##, then ##\lambda = 0## (because ##x_1 = \lambda##), so ##x_2 = 5##. This satisfies the conditions above but not the remaining Kuhn-Tucker condition ##\frac{\partial L}{\partial \lambda} \geq 0##, since ##\frac{\partial L}{\partial \lambda} = 4 - 0 - 5 = -1.##
 
squenshl said:
Great thanks.

If ##x_1 = 0##, then ##\lambda = 0## (because ##x_1 = \lambda##), so ##x_2 = 5##. This satisfies the conditions above but not the remaining Kuhn-Tucker condition ##\frac{\partial L}{\partial \lambda} \geq 0##, since ##\frac{\partial L}{\partial \lambda} = 4 - 0 - 5 = -1.##

You cannot assume ##\lambda = x_1## because you cannot assume ##\partial L/ \partial x_1 = 0##. If you do assume ##x_2 > 0## then you have two conditions to solve: ##\partial L / \partial x_2 = 0## and ##2 x_1 + x_2 = 4##, to find the "unknowns" ##x_2, \lambda##. You will now have a candidate set of values for ##x_1 (= 0), x_2## and ##\lambda##, so you can check if all the KT conditions work out for that candidate set.
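The candidate-checking step can be mechanised. A minimal helper of my own (not from the thread), written for problem 1's max-form Lagrangian ##L = x_1^2 + (x_2-5)^2 + \lambda(4 - 2x_1 - x_2)##, with a floating-point tolerance as an assumption:

```python
def kkt_ok(x1, x2, lam, tol=1e-9):
    """Kuhn-Tucker check for problem 1 (maximisation) with
    L = x1**2 + (x2 - 5)**2 + lam * (4 - 2*x1 - x2)."""
    dL1 = 2.0 * x1 - 2.0 * lam       # dL/dx1, must be <= 0
    dL2 = 2.0 * (x2 - 5.0) - lam     # dL/dx2, must be <= 0
    slack = 4.0 - 2.0 * x1 - x2      # constraint slack, must be >= 0
    signs = (dL1 <= tol and dL2 <= tol and
             x1 >= -tol and x2 >= -tol and lam >= -tol and slack >= -tol)
    # complementary slackness: each pair has at least one member equal to zero
    comp = (min(abs(dL1), abs(x1)) <= tol and
            min(abs(dL2), abs(x2)) <= tol and
            min(abs(lam), abs(slack)) <= tol)
    return signs and comp

print(kkt_ok(0.0, 5.0, 0.0))  # False: the x2 = 5 candidate violates 2*x1 + x2 <= 4
```

Each new guess (for example ##x_1 = 0## with the constraint tight) then becomes just another call.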

Anyway, I edited my previous post, to alter/expand on the suggestion. You should re-read it.
 
Great, thanks a lot!
P.S. I just saw your edited post.

For ##x_1 = 0## we get ##x_2 \leq 4##, but all of these values result in a negative ##\lambda##, violating the KT conditions.
 
squenshl said:
Great, thanks a lot!
P.S. I just saw your edited post.

For ##x_1 = 0## we get ##x_2 \leq 4##, but all of these values result in a negative ##\lambda##, violating the KT conditions.

So, try again! Like I said, these problems can require several guesses.

However, sometimes it makes sense to think about what the problem means before you even start to try solving it. Can you think of a simple geometric interpretation of the problem?
 
Great.

The answer should be ##x_1 = 2##, ##x_2 = 0## and ##\lambda = 2##.
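The geometric reading confirms this: the objective in problem 1 is the squared distance from the point ##(0,5)##, and a convex function attains its maximum over a polytope at a vertex, so it suffices to compare the corners of the feasible triangle. A quick check (the vertex list is read off from the constraints):

```python
# Problem 1 geometrically: maximise the squared distance from (0, 5)
# over the feasible triangle with vertices (0,0), (2,0), (0,4).
vertices = [(0.0, 0.0), (2.0, 0.0), (0.0, 4.0)]
f = lambda v: v[0] ** 2 + (v[1] - 5.0) ** 2
best = max(vertices, key=f)
print(best, f(best))  # (2.0, 0.0) 29.0 -- matching x1 = 2, x2 = 0
```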
 