Optimizing Functions: Strategies and Considerations

  • Thread starter: tcuay
  • Tags: Optimization

Homework Help Overview

The discussion revolves around optimizing functions, specifically focusing on finding local and global extrema. Participants are addressing two questions that involve applying gradient methods, Hessian matrices, and conditions for local maxima and minima.

Discussion Character

  • Exploratory, Assumption checking, Conceptual clarification

Approaches and Questions Raised

  • Participants discuss methods for finding stationary points and the implications of Hessian determinants. There is uncertainty about how to determine if local optima are also global optima, and questions arise regarding the conditions under which these methods can be applied.

Discussion Status

Some participants express frustration with their attempts to solve the problems, while others provide insights into the nature of global and local optima. There is an ongoing exploration of the relationship between local extrema and global behavior, with some guidance offered regarding the necessity of examining higher derivatives.

Contextual Notes

Participants note that both questions ask for global optima, which the original poster takes to imply the existence of finite global extrema. However, there is confusion about whether these extrema actually exist and about the methods for verifying them.

tcuay
I have attempted these two questions for hours but still have no outcome :confused:
For the first question:
part a) I take ∇f(x) and set it to zero;
then I find (x1, x2) = (0, -1) or ((-2a-1)/3, (-2a-4)/3);
but then for part b), after using the second-order necessary condition,
I have no idea how to continue;
part c) I know that if I can prove the function is convex or concave, I can surely conclude that the local minimum/maximum is a global minimum/maximum; but after differentiating twice, I don't know how to proceed further;

For Question 2,
part a) I got 5 possible points: (x1, x2) = (0.5, 0), (1, -1), (1, 1), (-1, -sqrt(3)), (-1, sqrt(3));
but in part b) I can only prove that (0.5, 0) is a local maximizer, while the other 4 points don't satisfy the second-order conditions (because the determinant of their Hessian matrix is smaller than zero);
part c) same, no idea how to do it;
part d) I find the steepest-descent direction vectors are both [0, 0]'; does that mean f(1, -1) is a local minimum?

Thanks for the help :frown:
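The second-derivative test used in the parts b) above can be sketched numerically. The functions below are stand-ins chosen only to illustrate the classification rules (the actual f is in the attached images), and the finite-difference helper is a minimal sketch, not a robust implementation:

```python
# Second-derivative test for f(x, y) at a stationary point, using
# finite differences. The test functions here are illustrative
# stand-ins, NOT the functions from the attachments.

def hessian_2d(f, x, y, h=1e-4):
    """Finite-difference Hessian entries (fxx, fxy, fyy) at (x, y)."""
    fxx = (f(x + h, y) - 2*f(x, y) + f(x - h, y)) / h**2
    fyy = (f(x, y + h) - 2*f(x, y) + f(x, y - h)) / h**2
    fxy = (f(x + h, y + h) - f(x + h, y - h)
           - f(x - h, y + h) + f(x - h, y - h)) / (4*h**2)
    return fxx, fxy, fyy

def classify(f, x, y):
    """Classify a stationary point (x, y) of f via the Hessian."""
    fxx, fxy, fyy = hessian_2d(f, x, y)
    det = fxx*fyy - fxy**2           # determinant of the 2x2 Hessian
    if det > 0:
        return "local min" if fxx > 0 else "local max"
    if det < 0:
        return "saddle"              # det < 0 means indefinite Hessian
    return "inconclusive"            # need higher derivatives

print(classify(lambda x, y: x**2 + y**2, 0, 0))   # local min
print(classify(lambda x, y: x**2 - y**2, 0, 0))   # saddle
```

Note that det H < 0 at a stationary point means a saddle, so the second-order sufficient condition failing is itself informative, not just inconclusive.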
 

Attachments

  • optimization.png (17.8 KB)
  • Q1.png (41 KB)
  • Q2.png (47 KB)
Nobody can help? :cry::cry:
 
tcuay said:
Nobody can help? :cry::cry:

Looking for stationary points and examining Hessians only works if the function has a (local) max or min at the point of interest. You can only apply these methods to global optima if you know that the functions have global optima, so first you need to examine that issue: does the function have a finite global max? Global min?
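The existence check suggested here can be probed numerically: evaluate the function far out along a few rays and see whether it stays bounded. The cubic below is a hypothetical stand-in (the thread's actual functions are only in the attachments), chosen because it has a local minimum but no finite global minimum:

```python
# Unboundedness probe: before trusting a "global" optimum, check how
# f behaves far along a few directions. This f is a stand-in example,
# not the function from the thread's attachments.

def f(x, y):
    return x**3 - 3*x + y**2   # local min at (1, 0), but unbounded below

# Along the ray (x, y) = (-t, 0): f(-t, 0) = -t**3 + 3t -> -infinity,
# so no finite global minimum exists even though a local one does.
for t in (1.0, 10.0, 100.0):
    print(t, f(-t, 0.0))
```

If every such probe stays bounded and the function is smooth and defined everywhere, the global extrema (when they exist) must occur among the stationary points.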
 
Ray Vickson said:
Looking for stationary points and examining Hessians only works if the function has a (local) max or min at the point of interest. You can only apply these methods to global optima if you know that the functions have global optima, so first you need to examine that issue: does the function have a finite global max? Global min?
First of all, thanks very much for your reply.
Both questions ask us to find the global optima, which implies they should have finite global optima; but what confuses me is how to determine whether those local optima are also global optima. Even though I know the methods for finding local optima (e.g. the first-order necessary condition and the second-order necessary/sufficient conditions), I still cannot solve these two problems.
They are very difficult for me. Could you spend some more time helping me solve them? :frown:
 
tcuay said:
First of all, thanks very much for your reply.
Both questions ask us to find the global optima, which implies they should have finite global optima; but what confuses me is how to determine whether those local optima are also global optima. Even though I know the methods for finding local optima (e.g. the first-order necessary condition and the second-order necessary/sufficient conditions), I still cannot solve these two problems.
They are very difficult for me. Could you spend some more time helping me solve them? :frown:

No, I cannot do any more, because that would involve my solving the problem for you. I will just say that your reasoning for the existence of a global max or min is faulty; just because somebody asks you to find it does not mean it exists. It may, or it may not!
 
tcuay said:
Both questions ask us to find the global optima, which implies they should have finite global optima;
As Ray says, don't deduce that.
They are both smooth functions, defined everywhere. If a global extremum is not also a local extremum, where is it?
 
Okay, you are right. Then I also want to ask: for Q2 part d, if I find that the steepest-descent direction vector at a point x* is zero, can I say that the point is a local minimizer? Can the direction vector be zero when the point is a saddle point or a maximum point?
 
tcuay said:
Okay, you are right.
Umm.. about what exactly?
When I wrote
If a global extremum is not also a local extremum, where is it?
there is an answer to that question.
Can the direction vector be zero when the point is a saddle point or a maximum point?
Yes, it will be. The gradient is zero if and only if the tangent plane is horizontal. To know whether it's a min, a max, a saddle, or even a horizontal valley or ridge, you need to look at higher derivatives.
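This point, that the steepest-descent direction -∇f vanishes at any stationary point, saddle or extremum alike, can be illustrated with a quick finite-difference sketch. Both functions here are illustrative stand-ins, not the thread's actual f:

```python
# The steepest-descent direction is -grad f, so it vanishes at ANY
# stationary point: min, max, or saddle. A zero direction therefore
# does not by itself certify a local minimizer.

def grad(f, x, y, h=1e-6):
    """Central-difference gradient of f at (x, y)."""
    gx = (f(x + h, y) - f(x - h, y)) / (2*h)
    gy = (f(x, y + h) - f(x, y - h)) / (2*h)
    return gx, gy

saddle = lambda x, y: x**2 - y**2   # (0, 0) is a saddle, not a min
bowl   = lambda x, y: x**2 + y**2   # (0, 0) is a genuine local min

# Steepest-descent direction (-gx, -gy) is (0, 0) at both points,
# so the Hessian (or higher derivatives) must decide which is which.
for name, f in (("saddle", saddle), ("bowl", bowl)):
    gx, gy = grad(f, 0.0, 0.0)
    print(name, (-gx, -gy))
```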
 
haruspex said:
Yes, it will be. The grad is zero if and only if the tangent plane is horizontal. To know whether it's a min, a max, a saddle, or even a horizontal valley or ridge, you need to look at higher derivatives.
Okay, thanks very much.
All the questions are now solved; thanks for your reminder.
 
