Maxima and Minima of a function of several variables

Sly37

Homework Statement



Find the maxima and minima of:

f(x,y)=(1/2)*x^2 + g(y)

g∈⊂ (δ⊂ ℝ )

in this region
Ω = {(x,y) ∈ ℝ² : (1/2)*x^2 + y^2 ≤ 1}

hint: g: δ⊆ ℝ→ℝ

The absolute min of f in Ω is 0
The absolute max of f in Ω is 1

Homework Equations


The Attempt at a Solution



I have the parameterization of the boundary of the region: x = √2·cos θ, y = sin θ.
I also know ∇f(x,y) = (x, g′(y)) = (0,0), so x = 0 and g′(y) = 0.
Hessian matrix:
H = [ 1      0
      0   g′′(y) ]
Determinant of the Hessian matrix = g′′(y)
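Since the Hessian here is diagonal with f_xx = 1 > 0, the second-derivative test at a candidate point (0, y₀) depends only on the sign of g′′(y₀). A sketch of the classification, assuming g is twice differentiable:

```latex
% Second-derivative (Hessian) test at a candidate (0, y_0) with g'(y_0) = 0:
H(0, y_0) = \begin{pmatrix} 1 & 0 \\ 0 & g''(y_0) \end{pmatrix},
\qquad
\det H = g''(y_0)
\begin{cases}
> 0 & \text{local minimum (since } f_{xx} = 1 > 0\text{)} \\
< 0 & \text{saddle point} \\
= 0 & \text{test is inconclusive}
\end{cases}
```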

How can I complete this problem if I don't have the function g(y)? How can I find g(y)?
 
g∈⊂ (δ⊂ ℝ )
What does that mean?

The new parameter set is for the border of Ω only? For maxima/minima inside, I would keep x and y. It is easy to find a condition for x there.

How can I complete this problem if I don't have the function g(y)? How can I find g(y)?
That is strange, indeed.
 
mfb said:
What does that mean?

The new parameter set is for the border of Ω only? For maxima/minima inside, I would keep x and y. It is easy to find a condition for x there.

That is strange, indeed.

I believe that means that g belongs to a subset of ℝ.

Yes, the new parameter is just for the border.

All the possible critical points would have to be of the form (0, y₀), with y₀ a solution of g′(y) = 0, right?
 
I don't recognize "belongs to" as a mathematical term. g is certainly not a subset of ℝ, as it is a function ℝ→ℝ.

All the possible critical points would have to be of the form (0, y₀), with y₀ a solution of g′(y) = 0, right?
Right, assuming g is differentiable.
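As a concrete sanity check, one can try a hypothetical choice such as g(y) = y² (not given in the problem, but consistent with the stated answers: min 0, max 1) and run the interior and boundary analysis symbolically:

```python
# Sanity check with an ASSUMED g(y) = y^2; the problem never specifies g,
# so this is only an illustration -- swap in any C^2 function to experiment.
import sympy as sp

x, y, t = sp.symbols('x y t', real=True)
g = y**2                      # hypothetical g
f = sp.Rational(1, 2)*x**2 + g

# Interior critical points: grad f = (x, g'(y)) = (0, 0)
grad = [sp.diff(f, v) for v in (x, y)]
crit = sp.solve(grad, [x, y], dict=True)
print(crit)                   # [{x: 0, y: 0}]

# Value at the interior critical point
print(f.subs(crit[0]))        # 0 -> the absolute minimum

# Boundary: x = sqrt(2)*cos(t), y = sin(t)
f_border = f.subs({x: sp.sqrt(2)*sp.cos(t), y: sp.sin(t)})
print(sp.simplify(f_border))  # cos(t)**2 + sin(t)**2 = 1 -> the absolute maximum
```

With this choice the only interior critical point is the origin, f(0,0) = 0, and f is identically 1 on the boundary ellipse, matching the given extrema.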
 
mfb said:
I don't recognize "belongs to" as a mathematical term. g is certainly not a subset of ℝ, as it is a function ℝ→ℝ.

Right, assuming g is differentiable.

That's what my teacher told me, but you are right, it doesn't make sense.
 