Optimization w/ Constraint Question (Multivariable Calculus)

In summary: the candidate points are (0,1), (0,-1), (1,0), and (-1,0); plugging them back in, the maximum is 2 and the minimum is 1. The student first set up Lagrange multipliers but could not extract x and y from the system; splitting into the cases ##x = 0## and ##\lambda = 1## resolved it, and eliminating a variable through the constraint was shown as an alternative method.
  • #1
danielhep

Homework Statement


Find any maxima/minima of ##f(x,y) = x^2+2y^2## on the unit circle centered at the origin.

Homework Equations


##\nabla f = \lambda \nabla g##
constraint: ##x^2+y^2=1##

The Attempt at a Solution


##\nabla f = 2x\,\mathbf{i}+4y\,\mathbf{j}##
##\nabla g = 2x\,\mathbf{i}+2y\,\mathbf{j}##
##2x=2\lambda x##
##4y=2\lambda y##

How do I solve this? I don't see any way to get numbers for x and y out of this. I'm really not sure how to go on with this problem.
 
  • #2
danielhep said:

Homework Statement


Find any maxima/minima of ##f(x,y) = x^2+2y^2## on the unit circle centered at the origin.

Homework Equations


##\nabla f = \lambda \nabla g##
constraint: ##x^2+y^2=1##

The Attempt at a Solution


##\nabla f = 2x\,\mathbf{i}+4y\,\mathbf{j}##
##\nabla g = 2x\,\mathbf{i}+2y\,\mathbf{j}##
##2x=2\lambda x##
##4y=2\lambda y##

How do I solve this? I don't see any way to get numbers for x and y out of this. I'm really not sure how to go on with this problem.

(1) The first equation can be written as ##2x(\lambda-1) = 0.## What does this tell you?
(2) Do you really need to use Lagrange multipliers for this problem?
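The case split suggested above can also be checked symbolically. Here is a sketch using sympy (the library choice and variable names are mine, not from the thread); it sets up ##\nabla f = \lambda \nabla g## together with the constraint and solves the resulting polynomial system:

```python
# Sketch: solve the Lagrange system for f = x^2 + 2y^2 on x^2 + y^2 = 1.
# Library and naming choices are illustrative, not from the thread.
import sympy as sp

x, y, lam = sp.symbols('x y lambda', real=True)
f = x**2 + 2*y**2          # objective
g = x**2 + y**2 - 1        # constraint, g = 0

# First-order conditions: grad f = lambda * grad g, plus the constraint
eqs = [sp.Eq(sp.diff(f, x), lam * sp.diff(g, x)),
       sp.Eq(sp.diff(f, y), lam * sp.diff(g, y)),
       sp.Eq(g, 0)]
sols = sp.solve(eqs, [x, y, lam], dict=True)

# Collect the candidate points and evaluate f at each
points = sorted({(int(s[x]), int(s[y])) for s in sols})
values = {p: int(f.subs({x: p[0], y: p[1]})) for p in points}
```

Running this yields the four candidates ##(0,\pm 1)## and ##(\pm 1, 0)##, matching the hand calculation.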
 
  • #3
Ray Vickson said:
(1) The first equation can be written as ##2x(\lambda-1) = 0.## What does this tell you?
(2) Do you really need to use Lagrange multipliers for this problem?
1) x=0. Then I can plug that back into g(x,y) and get (0,1). Then do the same thing for y and get (1,0). Then I can check those and find which is max and min? Is that correct? Does the 2 coefficient on ##2y^2## just go away completely?
2) What would be a better way? I just am following the steps I was taught.
 
  • #4
danielhep said:
1) x=0. Then I can plug that back into g(x,y) and get (0,1). Then do the same thing for y and get (1,0). Then I can check those and find which is max and min? Is that correct? Does the 2 coefficient on ##2y^2## just go away completely?
2) What would be a better way? I just am following the steps I was taught.

(1) means "either ##x=0## or ##\lambda = 1## (or both)". You have dealt with the ##x=0## case. What about the ##\lambda = 1## case?

Note: ##x = 0## does not give you just ##y = 1##; it can also give you ##y = -1##, because ##y^2 = 1## has two roots.

In the present case, simply evaluating ##f(x,y) = x^2+2 y^2## at the various solution points will tell you which ones are maxima and which are minima. The reason is that the objective ##f## is continuous and the constraint set ##g(x,y) = x^2 + y^2 - 1 = 0## is compact---that is, closed and bounded---so there must be a max and a min of ##f##. The Lagrangian conditions are necessarily satisfied at the optima, so the points you get are the only possible candidates.

In constrained optimization the second order tests for a max or a min are more involved than in the unconstrained case. A second-order necessary condition for a (local) minimum is that the Hessian of the Lagrangian (not just of ##f## alone!) must be positive semi-definite in the projection onto the tangent subspace at the solution point. A sufficient condition for a strict local minimum is that the projected Hessian of the Lagrangian be positive definite in the tangent subspace of the constraint.
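For this small problem the projected-Hessian test is easy to carry out numerically. The sketch below (my own, using numpy) assumes the multiplier values ##\lambda = 1## at ##(\pm 1, 0)## and ##\lambda = 2## at ##(0, \pm 1)## that follow from the first-order conditions; the Hessian of the Lagrangian ##L = f - \lambda g## is ##\mathrm{diag}(2-2\lambda,\; 4-2\lambda)##, and the tangent line of the circle at a point ##(x_0,y_0)## is spanned by ##(-y_0, x_0)##:

```python
# Illustrative check of the projected-Hessian (second-order) test for
# f = x^2 + 2y^2 on the circle x^2 + y^2 = 1. Names are my own choices.
import numpy as np

def projected_hessian(point, lam):
    """t^T (Hess L) t, where L = f - lam*g and t spans the tangent
    line of the unit circle at `point`."""
    x0, y0 = point
    H = np.diag([2.0 - 2.0 * lam, 4.0 - 2.0 * lam])  # Hess f - lam * Hess g
    t = np.array([-y0, x0])                          # tangent to x^2 + y^2 = 1
    return t @ H @ t

# (1, 0) with lambda = 1: positive projected Hessian => strict local min
h_min = projected_hessian((1.0, 0.0), 1.0)
# (0, 1) with lambda = 2: negative projected Hessian => local max
h_max = projected_hessian((0.0, 1.0), 2.0)
```

Note that the Hessian of ##f## alone is positive definite everywhere, so projecting the Hessian of the *Lagrangian* is what correctly flags ##(0,\pm 1)## as constrained maxima.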

(2) I'll let you think about alternative methods for a while. Look again at the problem!
 
  • #5
Ray Vickson said:
(1) means "either ##x=0## or ##\lambda = 1## (or both)". You have dealt with the ##x=0## case. What about the ##\lambda = 1## case?

Note: ##x = 0## does not give you just ##y = 1##; it can also give you ##y = -1##, because ##y^2 = 1## has two roots.

In the present case, simply evaluating ##f(x,y) = x^2+2 y^2## at the various solution points will tell you which ones are maxima and which are minima. The reason is that the objective ##f## is continuous and the constraint set ##g(x,y) = x^2 + y^2 - 1 = 0## is compact---that is, closed and bounded---so there must be a max and a min of ##f##. The Lagrangian conditions are necessarily satisfied at the optima, so the points you get are the only possible candidates.

In constrained optimization the second order tests for a max or a min are more involved than in the unconstrained case. A second-order necessary condition for a (local) minimum is that the Hessian of the Lagrangian (not just of ##f## alone!) must be positive semi-definite in the projection onto the tangent subspace at the solution point. A sufficient condition for a strict local minimum is that the projected Hessian of the Lagrangian be positive definite in the tangent subspace of the constraint.

(2) I'll let you think about alternative methods for a while. Look again at the problem!
1) Okay, so what I'm getting is (0,1), (0,-1), (1,0), (-1,0). Plugging these points back in, the maximum is 2 and the minimum is 1.
I don't think ##\lambda = 1## is important, since it seems that when I do that, it implies x can be whatever and y must be zero, which gets me points that I've already found.

2) I'm not sure that I know of any other strategy that can be used for solving constrained optimization problems, but I suppose just by looking at this you can sort of intuit that (0,1) and (0,-1) will be the maximums.
 
  • #6
danielhep said:
1) Okay, so what I'm getting is (0,1), (0,-1), (1,0), (-1,0). Plugging these points back in, the maximum is 2 and the minimum is 1.
I don't think ##\lambda = 1## is important, since it seems that when I do that, it implies x can be whatever and y must be zero, which gets me points that I've already found.

2) I'm not sure that I know of any other strategy that can be used for solving constrained optimization problems, but I suppose just by looking at this you can sort of intuit that (0,1) and (0,-1) will be the maximums.

(1) Assuming ##\lambda = 1##, the second condition ##4y = 2 \lambda y## gives ##4y = 2y##, so ##y = 0##. The constraint then implies ##x = \pm 1##.

Sometimes, knowing ##\lambda## is almost as important as knowing ##x## and ##y##, so knowing ##\lambda = 1## may be needed for certain types of ongoing analysis. Maybe at this stage you do not yet see its importance, but be assured, it is important.

(2) There are two methods for dealing with equality-constrained optimization: (a) the Lagrange multiplier method; and (b) elimination of variables via the constraint(s).

The normally-preferred method is (a); it is the basis of numerous effective numerical optimization codes, in which various steps towards improved solutions often involve not only the current estimates of the variables ##x,y,\ldots## but also the current estimates of the Lagrange multipliers. Doing things this way can improve the speed, accuracy, and reliability of algorithms by orders of magnitude. So, Lagrange multipliers are of huge importance in the field (despite what some may try to tell you).

However, in certain, special cases, (b) may be easier and faster. In your problem, you can see that feasible ##(x,y)## satisfy ##-1 \leq y \leq 1##, and for any ##y## in that interval we must have ##x^2 = 1 - y^2##. Therefore, on the constraint set we have ##f(x,y) = x^2 + 2 y^2 = 1-y^2 + 2 y^2 = 1+y^2##. The problem becomes one-dimensional: ##\max / \min (1+y^2), \; -1 \leq y \leq 1##. Obviously, the (global) min is at ##y = 0## (giving ##x = \pm 1##) while the global maxima are at ##y = \pm 1## (giving ##x = 0##).

Usually method (a) is preferred over method (b), but sometimes for problems having special structure a method like (b) is easier.
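The elimination argument in (b) is easy to confirm numerically: parametrizing the circle as ##(\cos\theta, \sin\theta)## and evaluating ##f## shows it collapses to ##1 + y^2##. A short sketch (parametrization and grid size are my own choices):

```python
# Numeric confirmation of the elimination method: on x^2 + y^2 = 1,
# f(x, y) = x^2 + 2y^2 reduces to 1 + y^2. Grid choice is illustrative.
import numpy as np

theta = np.linspace(0.0, 2.0 * np.pi, 2001)
x, y = np.cos(theta), np.sin(theta)      # points on the unit circle
f = x**2 + 2.0 * y**2

assert np.allclose(f, 1.0 + y**2)        # matches the reduction above
fmax, fmin = f.max(), f.min()            # ~2 at y = ±1, 1 at y = 0
```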
 
  • #7
Ray Vickson said:
(1) Assuming ##\lambda = 1##, the second condition ##4y = 2 \lambda y## gives ##4y = 2y##, so ##y = 0##. The constraint then implies ##x = \pm 1##.

Sometimes, knowing ##\lambda## is almost as important as knowing ##x## and ##y##, so knowing ##\lambda = 1## may be needed for certain types of ongoing analysis. Maybe at this stage you do not yet see its importance, but be assured, it is important.

(2) There are two methods for dealing with equality-constrained optimization: (a) the Lagrange multiplier method; and (b) elimination of variables via the constraint(s).

The normally-preferred method is (a); it is the basis of numerous effective numerical optimization codes, in which various steps towards improved solutions often involve not only the current estimates of the variables ##x,y,\ldots## but also the current estimates of the Lagrange multipliers. Doing things this way can improve the speed, accuracy, and reliability of algorithms by orders of magnitude. So, Lagrange multipliers are of huge importance in the field (despite what some may try to tell you).

However, in certain, special cases, (b) may be easier and faster. In your problem, you can see that feasible ##(x,y)## satisfy ##-1 \leq y \leq 1##, and for any ##y## in that interval we must have ##x^2 = 1 - y^2##. Therefore, on the constraint set we have ##f(x,y) = x^2 + 2 y^2 = 1-y^2 + 2 y^2 = 1+y^2##. The problem becomes one-dimensional: ##\max / \min (1+y^2), \; -1 \leq y \leq 1##. Obviously, the (global) min is at ##y = 0## (giving ##x = \pm 1##) while the global maxima are at ##y = \pm 1## (giving ##x = 0##).

Usually method (a) is preferred over method (b), but sometimes for problems having special structure a method like (b) is easier.
Thank you for your help! I think I just needed to get by those barriers and now I understand these problems better.

I'll keep strategy b in mind, but I think I might stick to the slightly harder way for now just to make sure I get it.
 
  • #8
danielhep said:
Thank you for your help! I think I just needed to get by those barriers and now I understand these problems better.

I'll keep strategy b in mind, but I think I might stick to the slightly harder way for now just to make sure I get it.

I hope I did not give you the wrong impression. 99% of the time the Lagrange multiplier method is easier than the elimination approach. It is just in quite rare cases that the elimination method is easier, such as in your current problem.
 

1. What is optimization with constraint?

Optimization with constraint is a mathematical process used to find the maximum or minimum value of a multivariable function while taking into account certain constraints or limitations. It involves using techniques from multivariable calculus, such as partial derivatives and Lagrange multipliers, to find the optimal solution.

2. Why is optimization with constraint important?

Optimization with constraint is important because it allows us to find the most efficient or optimal solution to a problem while taking into account any limitations or restrictions. It has many real-world applications, such as in engineering, economics, and computer science.

3. What are some common constraints in optimization problems?

Some common constraints in optimization problems include budget constraints, production constraints, time constraints, and resource constraints. These constraints can be represented by equations or inequalities that limit the possible solutions to the optimization problem.

4. How do you solve an optimization problem with constraints?

To solve an optimization problem with constraints, you first need to set up the problem by identifying the objective function and the constraints. Then, you can use techniques from multivariable calculus, such as taking partial derivatives and using Lagrange multipliers, to find the optimal solution. This often involves setting up a system of equations and solving for the variables.
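As a small worked illustration of these steps on a different problem (my own choice: maximize ##f(x,y) = xy## subject to the budget-style constraint ##x + y = 10##), the system of equations can be set up and solved with sympy:

```python
# Illustrative Lagrange-multiplier example: maximize x*y subject to
# x + y = 10. The problem and names are my own, chosen for illustration.
import sympy as sp

x, y, lam = sp.symbols('x y lambda', real=True)
f = x * y            # objective
g = x + y - 10       # constraint, g = 0

# grad f = lambda * grad g in each variable, plus the constraint
eqs = [sp.Eq(sp.diff(f, v), lam * sp.diff(g, v)) for v in (x, y)]
eqs.append(sp.Eq(g, 0))

sol = sp.solve(eqs, [x, y, lam], dict=True)[0]   # single stationary point
```

Here the conditions give ##y = \lambda##, ##x = \lambda##, and ##x + y = 10##, so ##x = y = 5## with ##f = 25##.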

5. What are some challenges of optimization with constraint?

One of the main challenges of optimization with constraint is finding the optimal solution within the constraints. This can be a difficult and time-consuming process, especially for complex problems. Another challenge is determining the appropriate constraints to use, as they can greatly affect the outcome of the optimization problem.
