Optimization w/ Constraint Question (Multivariable Calculus)

1. May 26, 2017

danielhep

1. The problem statement, all variables and given/known data
Find any maxima/minima of $f(x,y) = x^2 + 2y^2$ on the unit circle centered at the origin.

2. Relevant equations
constraint: $x^2 + y^2 = 1$

3. The attempt at a solution
$\nabla f = 2x\,\mathbf{i} + 4y\,\mathbf{j}$
$\nabla g = 2x\,\mathbf{i} + 2y\,\mathbf{j}$
$2x = \lambda \cdot 2x$
$2y = \lambda \cdot 4y$

How do I solve this? I don't see any way to get numbers for $x$ and $y$ out of these equations, and I'm not sure how to proceed with this problem.

2. May 26, 2017

Ray Vickson

(1) The first equation can be written as $2x(\lambda-1) = 0.$ What does this tell you?
(2) Do you really need to use Lagrange multipliers for this problem?

3. May 26, 2017

danielhep

1) $x=0$. Then I can plug that back into $g(x,y)$ and get $(0,1)$. Then do the same thing for $y$ and get $(1,0)$. Then I can check those and find which is max and min? Is that correct? Does the coefficient of 2 on $2y^2$ just go away completely?
2) What would be a better way? I just am following the steps I was taught.

4. May 26, 2017

Ray Vickson

(1) means "either $x=0$ or $\lambda = 1$ (or both)". You have dealt with the $x=0$ case. What about the $\lambda = 1$ case?

Note: $x = 0$ does not give you just $y = 1$; it can also give you $y = -1$, because $y^2 = 1$ has two roots.

In the present case, simply evaluating $f(x,y) = x^2+2 y^2$ at the various solution points will tell you which ones are maxima and which are minima. The reason is that the objective $f$ is continuous and the constraint set $g(x,y) = x^2 + y^2 - 1 = 0$ is compact---that is, closed and bounded---so there must be a max and a min of $f$. The Lagrangian conditions are necessarily satisfied at the optima, so the points you get are the only possible candidates.
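[Editor's note: the reasoning above can be checked mechanically. The snippet below, which is my own illustration and not part of the thread, solves the Lagrange conditions in the standard form $\nabla f = \lambda \nabla g$ with sympy and evaluates $f$ at each candidate point.]

```python
# Solve the Lagrange conditions for f = x^2 + 2y^2 on x^2 + y^2 = 1,
# then evaluate f at each candidate point to classify max vs. min.
import sympy as sp

x, y, lam = sp.symbols('x y lam', real=True)
f = x**2 + 2*y**2
g = x**2 + y**2 - 1  # constraint g = 0

# Stationarity of the Lagrangian f - lam*g, plus the constraint itself.
eqs = [sp.diff(f - lam*g, v) for v in (x, y)] + [g]
candidates = sp.solve(eqs, (x, y, lam), dict=True)

for sol in candidates:
    print((sol[x], sol[y]), 'f =', f.subs(sol))
# Candidates: (0, 1) and (0, -1) with f = 2, (1, 0) and (-1, 0) with f = 1.
```

Since the constraint set is compact, the largest value (2) must be the maximum and the smallest (1) the minimum, exactly as argued above.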

In constrained optimization the second order tests for a max or a min are more involved than in the unconstrained case. A second-order necessary condition for a (local) minimum is that the Hessian of the Lagrangian (not just of $f$ alone!) must be positive semi-definite in the projection onto the tangent subspace at the solution point. A sufficient condition for a strict local minimum is that the projected Hessian of the Lagrangian be positive definite in the tangent subspace of the constraint.
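[Editor's note: the projected-Hessian test described above can be illustrated at one of the candidate points. This sketch, mine rather than the thread's, checks the point $(1,0)$ with $\lambda = 1$, where the tangent space of the constraint is spanned by $(0,1)$.]

```python
# Projected Hessian test at (x, y, lam) = (1, 0, 1) for the Lagrangian
# L = (x^2 + 2y^2) - lam*(x^2 + y^2 - 1).
import sympy as sp

x, y, lam = sp.symbols('x y lam', real=True)
L = (x**2 + 2*y**2) - lam*(x**2 + y**2 - 1)
H = sp.hessian(L, (x, y))  # diag(2 - 2*lam, 4 - 2*lam)

# At (1, 0) the constraint gradient is (2, 0), so the tangent space
# is spanned by t = (0, 1).
H_at = H.subs({x: 1, y: 0, lam: 1})
t = sp.Matrix([0, 1])
projected = (t.T * H_at * t)[0]
print(projected)  # 2 > 0: strict local minimum at (1, 0)
```

Note that the full Hessian of the Lagrangian at this point is diag(0, 2), which is only positive semi-definite; it is the restriction to the tangent subspace that is positive definite, which is exactly why the test must be done in the projection.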

(2) I'll let you think about alternative methods for a while. Look again at the problem!

5. May 26, 2017

danielhep

1) Okay, so what I'm getting is $(0,1)$, $(0,-1)$, $(1,0)$, $(-1,0)$. Plugging these points back in, the maximum is 2 and the minimum is 1.
I don't think $\lambda = 1$ is important, since it seems that when I assume it, $x$ can be anything and $y$ must be zero, which gets me points that I've already found.

2) I don't know of any other strategy for solving constrained optimization problems, but I suppose just by looking at this you can sort of intuit that $(0,1)$ and $(0,-1)$ will be the maxima.

6. May 26, 2017

Ray Vickson

(1) Assuming $\lambda = 1$, the second condition $2y = 4 \lambda y$ gives $2y = 4y$, so $y = 0$. The constraint then implies $x = \pm 1$.

Sometimes, knowing $\lambda$ is almost as important as knowing $x$ and $y$, so knowing $\lambda = 1$ may be needed for certain types of ongoing analysis. Maybe at this stage you do not yet see its importance, but be assured, it is important.

(2) There are two methods for dealing with equality-constrained optimization: (a) the Lagrange multiplier method; and (b) elimination of variables via the constraint(s).

The normally-preferred method is (a); it is the basis of numerous effective numerical optimization codes, in which various steps towards improved solutions often involve not only the current estimates of the variables $x,y,\ldots$ but also the current estimates of the Lagrange multipliers. Doing things this way can improve the speed, accuracy, and reliability of algorithms by orders of magnitude. So, Lagrange multipliers are of huge importance in the field (despite what some may try to tell you).

However, in certain, special cases, (b) may be easier and faster. In your problem, you can see that feasible $(x,y)$ satisfy $-1 \leq y \leq 1$, and for any $y$ in that interval we must have $x^2 = 1 - y^2$. Therefore, on the constraint set we have $f(x,y) = x^2 + 2 y^2 = 1-y^2 + 2 y^2 = 1+y^2$. The problem becomes one-dimensional: $\max / \min (1+y^2), \; -1 \leq y \leq 1$. Obviously, the (global) min is at $y = 0$ (giving $x = \pm 1$) while the global maxima are at $y = \pm 1$ (giving $x = 0$).
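[Editor's note: the one-dimensional reduction above is easy to verify numerically. This is an illustrative snippet of my own, assuming numpy, not part of the original post.]

```python
# On the constraint x^2 + y^2 = 1 we have x^2 = 1 - y^2, so
# f(x, y) = x^2 + 2y^2 reduces to h(y) = 1 + y^2 for y in [-1, 1].
import numpy as np

y = np.linspace(-1.0, 1.0, 2001)
h = 1.0 + y**2  # f restricted to the circle

print('min', h.min(), 'near y =', y[h.argmin()])  # min 1 near y = 0
print('max', h.max(), 'at y =', y[h.argmax()])    # max 2 at y = +/-1
```

The grid check agrees with the symbolic answer: the global minimum value 1 occurs at $y = 0$ (so $x = \pm 1$) and the global maximum value 2 at $y = \pm 1$ (so $x = 0$).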

Usually method (a) is preferred over method (b), but sometimes for problems having special structure a method like (b) is easier.

Last edited: May 26, 2017
7. May 26, 2017

danielhep

Thank you for your help! I think I just needed to get by those barriers and now I understand these problems better.

I'll keep strategy b in mind, but I think I might stick to the slightly harder way for now just to make sure I get it.

8. May 26, 2017

Ray Vickson

I hope I did not give you the wrong impression. 99% of the time the Lagrange multiplier method is easier than the elimination approach. It is only in quite rare cases, such as your current problem, that the elimination method is easier.