Optimization problem with constraint

SUMMARY

The optimization problem involves determining the existence of maximum and minimum values for the function f(x,y) = x^2 + y^2 under the constraint 2x^3 + 3x^2y + 3xy^2 + 2y^3 = 1. The Lagrangian multiplier method indicates that the only real solution is x = y = 1/10^(1/3), which is a minimum. The analysis confirms that there is no maximum due to the unbounded nature of the constraint set, as demonstrated through polar coordinates and the uniqueness of the solution derived from the Groebner basis.

PREREQUISITES
  • Understanding of Lagrangian multipliers in optimization
  • Knowledge of polynomial inequalities and their implications
  • Familiarity with Groebner basis for solving systems of equations
  • Basic concepts of compactness in topology
NEXT STEPS
  • Study the application of Lagrangian multipliers in constrained optimization problems
  • Explore the use of Groebner bases in computational algebra
  • Learn about the geometric interpretation of constraints in optimization
  • Investigate the properties of compact sets in mathematical analysis
USEFUL FOR

Mathematicians, optimization specialists, and students studying advanced calculus or mathematical analysis who are interested in constrained optimization techniques and their applications.

dobedobedo
PROBLEM STATEMENT:
Determine if f(x,y) = x^2+y^2 has a maximum and a minimum when we have the constraint 2x^3+3x^{2}y+3xy^{2}+2y^3=1. (1)

ATTEMPT TO SOLUTION:
A standard way of solving these kinds of problems is the Lagrange multiplier method, which compares the gradient of the function with the gradient of the constraint. Sometimes, however, one can quickly see that a function lacks a maximum on the constraint set. This is the case for f(x,y) = x^{2}y with the constraint x+y=1: along the line x+y=1 we have f = x^2(1-x), which tends to infinity as x approaches negative infinity, so there is no maximum. Is there a similar way to look at problem (1)? How do I prove that it has or hasn't a maximum?
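The unboundedness in that simpler example is easy to check numerically. A minimal Python sketch (the function name `f_on_line` is my own, not from the problem), substituting y = 1 - x:

```python
# On the line x + y = 1 we have f(x, y) = x^2 * y = x^2 * (1 - x),
# which is unbounded above: it tends to +infinity as x -> -infinity.
def f_on_line(x):
    return x**2 * (1 - x)

for x in [-10, -100, -1000]:
    print(x, f_on_line(x))   # values grow without bound
```

The same one-variable substitution trick does not work directly for constraint (1), which is why the polar-coordinate argument below is needed.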

I've used the Lagrange multiplier method, and I get the solution x = y = \frac{1}{10^{1/3}}. I have not verified that it is correct, and I have not checked whether it is a max or a min.
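A quick numerical sanity check of that candidate point (a Python sketch of my own; `g` is the left-hand side of constraint (1) and `f` is the objective):

```python
# Candidate stationary point x = y = 1/10^(1/3) from the Lagrange conditions.
def g(x, y):
    # left-hand side of constraint (1)
    return 2*x**3 + 3*x**2*y + 3*x*y**2 + 2*y**3

def f(x, y):
    return x**2 + y**2

a = 10 ** (-1/3)
print(g(a, a))   # equals 1, since g(a, a) = 10*a^3 = 10*(1/10)
print(f(a, a))   # objective value 2*a^2 = 2/10^(2/3)
```

So the point does lie on the curve; whether it is a max or a min is the remaining question.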

How do I prove that this function has or has not a max or min?

[EDIT1:]
Okay. So we can interpret the question geometrically: what is the smallest or largest distance from the origin to the curve, and does such a smallest or largest distance exist? Hm... I can't visualize curve (1). If it is "closed", like an ellipse or a circle, then a smallest and a largest distance certainly exist. But if it is like a line - which extends infinitely - there is a smallest distance but no largest one. So the question is: how do I know whether the curve defined by (1) is bounded?

[EDIT2:]
Okay guys. The following inequalities are true:
(x^3-1)^2 = x^6+1-2x^3 \geq 0
\Rightarrow x^6+1 \geq 2x^3

(x^2-y)^2 = x^4+y^2-2x^{2}y \geq 0
\Rightarrow \frac{3}{2}(x^4+y^2) \geq 3x^{2}y

Add a little symmetry reasoning (apply the same two inequalities with x and y swapped), and we get an inequality which can be used to study the constraint (1). We get:

2+x^6+y^6+\frac{3}{2}(x^4+y^4+x^2+y^2) \geq 2(x^3+y^3)+3(x^{2}y+xy^{2})=1

where the right-hand side is the constraint (1), written a little more nicely. So... what conclusions can I draw from this? If I subtract 1 from both sides of the inequality, I get an expression on the left which is always positive, independent of the choice of (x,y), and on the right something equal to zero. Is this enough to prove that the set of points (1) is unbounded, and therefore not compact (the curve is not of finite length)?

I find it hard to accept this kind of reasoning, since I never clearly prove that the set of points (1) is unbounded. But I don't know, guys. I would appreciate it if somebody could help me interpret this, and maybe find a formal way of proving that the set of points is not compact...
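For what it's worth, the inequality from EDIT2 is easy to spot-check numerically. A small Python sketch (function names `lhs`/`rhs` are my own) samples random points and confirms it, though as noted above this alone does not settle boundedness either way:

```python
import random

def lhs(x, y):
    # left side: 2 + x^6 + y^6 + (3/2)(x^4 + y^4 + x^2 + y^2)
    return 2 + x**6 + y**6 + 1.5*(x**4 + y**4 + x**2 + y**2)

def rhs(x, y):
    # right side: the constraint expression 2(x^3+y^3) + 3(x^2 y + x y^2)
    return 2*(x**3 + y**3) + 3*(x**2*y + x*y**2)

random.seed(0)
for _ in range(10000):
    x = random.uniform(-10, 10)
    y = random.uniform(-10, 10)
    assert lhs(x, y) >= rhs(x, y)   # holds, being a sum of squares plus constants
print("inequality holds at all sampled points")
```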
 
dobedobedo said:
Determine if f(x,y) = x^2+y^2 has a maximum and a minimum when we have the constraint 2x^3+3x^{2}y+3xy^{2}+2y^3=1. (1) [...]

If there exists a finite constrained max or min, it must occur where the gradient of the Lagrangian = 0. (This is a theorem, which applies to any problem involving continuously-differentiable f and g, provided that the constraint obeys a "constraint qualification" at an optimal point---in this case, it just requires that the gradient of g should not be the zero vector at an optimum.)

You can show that there is just ONE real solution to the optimality equations and the constraint, so it cannot be a maximum. (If it were a maximum, the constraint set would be bounded---it would lie inside the disk f ≤ max---and it is closed, being the preimage of 1 under the continuous function g, hence compact. But then a minimum would also exist, so there would have to be another Lagrangian solution.) So, the point you found (which is correct, by the way) must be a minimum, and there cannot be any maximum. Below, I will deal with the uniqueness problem for the equations; for now, let's convince ourselves there is no max.

If you change to polar coordinates x = r*cos(t), y = r*sin(t), you get g = r^3*C(t), where C(t) is a polynomial in cos(t) and sin(t). This polynomial has zeros in (0,2*pi)---just plot it and see. (In fact C(t) = 0 when tan(t) = -1: along the line y = -x the cubic g vanishes identically.) If t0 is such a zero of C(t), we can take t -> t0 and r -> ∞ in such a way that we keep r^3*C(t) = 1, and that means that we can have an unbounded sequence of points in the constraint set.
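The zero of C(t) in (pi/2, pi) can be located numerically by bisection, since C(pi/2) = 2 > 0 and C(pi) = -2 < 0. A Python sketch (the name `C` mirrors the notation above):

```python
import math

def C(t):
    # angular factor of g in polar coordinates: g = r^3 * C(t)
    c, s = math.cos(t), math.sin(t)
    return 2*c**3 + 3*c**2*s + 3*c*s**2 + 2*s**3

# C(pi/2) = 2 > 0 and C(pi) = -2 < 0, so a zero lies in between
lo, hi = math.pi/2, math.pi
for _ in range(60):
    mid = (lo + hi) / 2
    if C(mid) > 0:
        lo = mid
    else:
        hi = mid
t0 = (lo + hi) / 2
print(t0)   # converges to 3*pi/4, i.e. the direction of the line y = -x
```

Approaching t0 with r^3*C(t) held at 1 forces r to infinity, which is exactly the unbounded sequence described above.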

As for uniqueness: I will be lazy and let Maple compute a Groebner basis of the system: letting L = f + u*g be the Lagrangian (with Lagrange multiplier u instead of λ), let
sys = [Lx,Ly,g-1]. Here, we want the three functions in 'sys' to vanish. We can obtain a Groebner basis:
with(Groebner):                   # load the 'Groebner' package
B := Basis(sys, plex(x, y, u));   # compute a Groebner basis
B:=[-128-4968*u^3+18225*u^6, 360*u^2-1215*u^5-64*y+216*u^3*y, 104*u+135*u^4+216*u^2*y+96*y^2, -164*u^2+675*u^5+16*y+16*x]

What this means is that the roots of 'sys' occur among the roots of B, and vice versa.
The first entry of B is b1 = -128-4968*u^3+18225*u^6, and setting this to zero implies either u = 2/3 or u = -(2/15)*(10)^(1/3) (or else u is complex). Substituting u = 2/3 into the
second component of B gives 0 identically, while substituting it into the third component of B gives 96*(y^2 + y + 1), whose two roots are complex. Thus, u = 2/3 does not lead to a real solution.

Therefore, we must have u = -(2/15)*(10)^(1/3), and substituting this into the second component of B gives a linear equation in y, with solution y = 1/(10)^(1/3). Substituting u into the third component of B gives a quadratic in y, one of whose roots is that same value. Finally, substituting u and y into the fourth component of B gives a linear equation in x with solution x = 1/(10)^(1/3). Therefore, x = 1/10^(1/3), y = 1/10^(1/3), and
u = -(2/15)*10^(1/3) is the only real solution.

Note: in principle one could compute a Groebner basis by hand, but it would take months of painstaking work and hundreds of pages of calculations. Using a computer algebra package is more or less a necessity.

RGV
 
