Gradient Vector Proof for a Local Minimizer: Df(x) = 0 for f: R^n → R

SUMMARY

The discussion centers on proving that the gradient Df(x) equals zero at a local minimizer x of a function f: R^n → R that has first-order partial derivatives. The proof restricts f to a line through x by defining q(t) = f(x + th) for |t| < r; since 0 is an extreme point of q, the one-variable result gives q'(0) = (∂f/∂x_i)(x) = 0. The replies stress that the direction vector h must be chosen correctly (along each coordinate axis) so that q'(0) really is the i-th partial derivative.

PREREQUISITES
  • Understanding of first-order partial derivatives
  • Knowledge of local minimizers in multivariable calculus
  • Familiarity with directional derivatives
  • Concept of extreme points in calculus
NEXT STEPS
  • Study the properties of first-order partial derivatives in multivariable functions
  • Learn about the implications of local minimizers in optimization problems
  • Explore the concept of directional derivatives and their applications
  • Investigate the relationship between extreme points and derivatives in calculus
USEFUL FOR

Students and educators in multivariable calculus, mathematicians focusing on optimization, and anyone interested in understanding the behavior of functions at local minima.

bubblesewa

Homework Statement



Suppose that the function f: R^n → R has first-order partial derivatives and that the point x in R^n is a local minimizer for f: R^n → R, meaning that there is a positive number r such that
f(x + h) ≥ f(x) if dist(x, x + h) < r.
Prove that Df(x) = 0.

Homework Equations



Df(x) = (∂f/∂x_1, ∂f/∂x_2, ..., ∂f/∂x_n)
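
As a concrete sanity check (an illustrative example, not part of the original thread), a simple quadratic shows the theorem in action:

```latex
% Illustrative example (not from the thread): the quadratic
%   f(x) = x_1^2 + x_2^2 + \cdots + x_n^2
% has gradient
\[
  Df(x) = \left( \frac{\partial f}{\partial x_1}, \dots,
                 \frac{\partial f}{\partial x_n} \right)
        = (2x_1, 2x_2, \dots, 2x_n),
\]
% which vanishes exactly at the minimizer x = 0, as the theorem predicts.
```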

The Attempt at a Solution



We know that the function has first-order partial derivatives, which makes it possible to form the gradient vector, and the definition of a local minimizer is given in the problem. I just need to prove that all partial derivatives are equal to zero. But how does knowing the local minimizer help me figure out the gradient vector?
 
Welcome to PF!

bubblesewa said:
Suppose that the function f: R^n → R has first-order partial derivatives and that the point x in R^n is a local minimizer for f: R^n → R, meaning that there is a positive number r such that
f(x + h) ≥ f(x) if dist(x, x + h) < r.
Prove that Df(x) = 0.

I just need to prove that all partial derivatives are equal to zero. But how does knowing the local minimizer help me figure out the gradient vector?

Hi bubblesewa! Welcome to PF! :smile:

Find the directional derivative of f along each coordinate axis (keeping all the other coordinates constant) :wink:
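
A minimal numerical sketch of this hint (the sample function f, the minimizer, and the step size eps are my own choices, not from the thread): restrict f to each coordinate axis at a known minimizer and check that the one-variable derivative there is approximately zero.

```python
import numpy as np

# Sample function with a local (here global) minimizer at the origin.
def f(x):
    return x[0]**2 + 3.0 * x[1]**2

x_star = np.array([0.0, 0.0])  # the minimizer
n = len(x_star)
eps = 1e-6

for i in range(n):
    e_i = np.zeros(n)
    e_i[i] = 1.0  # i-th standard basis vector: move along one axis only
    # q(t) = f(x_star + t*e_i) restricts f to the i-th coordinate axis;
    # a central difference approximates q'(0) = (∂f/∂x_i)(x_star).
    q_prime_0 = (f(x_star + eps * e_i) - f(x_star - eps * e_i)) / (2 * eps)
    print(f"∂f/∂x_{i+1} at the minimizer ≈ {q_prime_0:.2e}")  # prints ≈ 0
```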
 
Is this about what my proof should look like then? And thanks for welcoming me btw.

Since x is an interior point of R^n, we can choose a positive number r such that the open ball B_r(x) is contained in R^n. Fix an index i with 1 ≤ i ≤ n. Then do I suppose that I have some function, say q(t) = f(x + th) for |t| < r? Then the point 0 is an extreme point of the function q: (-r, r) → R, so q'(0) = (∂f/∂x_i)(x) = 0.
 
bubblesewa said:
Then do I suppose that I have some function, say q(t) = f(x + th) for |t| < r? Then the point 0 is an extreme point of the function q: (-r, r) → R, so q'(0) = (∂f/∂x_i)(x) = 0.

Hi bubblesewa! :smile:

Yes, in principle that's right …

you define a line with that parameter t, and since, as you say, 0 is an extreme point of the line, the derivative along the line must be zero. :smile:

However, you haven't yet defined h so as to get the line that gives you ∂f/∂x_i, have you? :wink:
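
For completeness, here is one way to supply the missing definition of h and finish the argument (a sketch; the choice h = e_i is exactly what the reply above is pointing to):

```latex
% Sketch of the finished proof. Fix i with 1 <= i <= n and choose
% h = e_i, the i-th standard basis vector, so that moving along the
% line x + t e_i changes only the i-th coordinate.
\[
  q_i(t) = f(x + t\,e_i), \qquad |t| < r.
\]
% Since dist(x, x + t e_i) = |t| < r, the local-minimizer property gives
% q_i(t) = f(x + t e_i) >= f(x) = q_i(0), so 0 is an interior minimizer
% of q_i on (-r, r). By the one-variable interior-extremum theorem,
\[
  q_i'(0) = \frac{\partial f}{\partial x_i}(x) = 0.
\]
% As i was arbitrary, every partial derivative vanishes, hence Df(x) = 0.
```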
 
