Gradient Vector Proof for Local Minimizer: Df(x) = 0 for f: R^n --> R


Homework Help Overview

The discussion revolves around proving that the gradient vector Df(x) equals zero at a local minimizer x of a function f: R^n --> R that has first-order partial derivatives. The context involves understanding what the local minimizer condition implies about the behavior of the function's derivatives.

Discussion Character

  • Exploratory, Conceptual clarification, Mathematical reasoning

Approaches and Questions Raised

  • Participants discuss the relationship between the local minimizer condition and the gradient vector; one participant defines a function q(t) to analyze the behavior of f along a line. Questions arise about how to choose the direction vector h so that the construction yields each partial derivative.

Discussion Status

The discussion is active, with participants exploring the implications of the local minimizer condition on the derivatives of the function. Some guidance has been offered regarding the directional derivative and the approach to defining the function q(t), but no consensus has been reached on the specific steps to take next.

Contextual Notes

Participants are navigating the definitions and implications of local minimizers and the necessary conditions for the gradient vector, with some uncertainty about the proper setup for their proofs.

bubblesewa

Homework Statement



Suppose that the function f: R^n --> R has first-order partial derivatives and that the point x in R^n is a local minimizer for f, meaning that there is a positive number r such that
f(x+h) ≥ f(x) if dist(x, x+h) < r.
Prove that Df(x) = 0.

Homework Equations



Df(x) = (∂f/∂x1, ∂f/∂x2, ..., ∂f/∂xn)

The Attempt at a Solution



We know that the function has first-order partial derivatives, so the gradient vector is well-defined. The definition of a local minimizer is already given in the problem, and I just need to prove that all the partial derivatives equal zero. But how does knowing that x is a local minimizer help me pin down the gradient vector?
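As a quick numerical sanity check (not part of the proof), here is a sketch using a made-up sample function with a known minimizer: central finite differences estimate each partial derivative, and every component comes out essentially zero at the minimizer. The function f and the point x_min below are assumptions chosen for illustration.

```python
def f(x):
    # sample function with its local (and global) minimizer at (1.0, -2.0)
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

def partial(f, x, i, h=1e-6):
    # central-difference estimate of the i-th partial derivative of f at x
    xp, xm = list(x), list(x)
    xp[i] += h
    xm[i] -= h
    return (f(xp) - f(xm)) / (2 * h)

x_min = [1.0, -2.0]
grad = [partial(f, x_min, i) for i in range(2)]
print(grad)  # every component is ~0 at the minimizer
```

This only illustrates the claim for one example; the proof itself must work for an arbitrary f with first-order partials.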
 

bubblesewa said:
Suppose that the function f: R^n --> R has first-order partial derivatives and that the point x in R^n is a local minimizer for f, meaning that there is a positive number r such that
f(x+h) ≥ f(x) if dist(x, x+h) < r.
Prove that Df(x)=0.

I just need to prove that all partial derivatives are equal to zero. But how does knowing the local minimizer help me figure out the gradient vector?

Hi bubblesewa! Welcome to PF! :smile:

Find the directional derivative of f along each coordinate axis (keeping all the other coordinates constant) :wink:
 
Is this about what my proof should look like then? And thanks for welcoming me btw.

Since x is an interior point of R^n, we can choose a positive number r such that the open ball Br(x) is contained in R^n. Fix an index i with 1 ≤ i ≤ n. Then suppose I have some function, say q(t), where q(t) = f(x + th) for |t| < r. Then the point 0 is an extreme point of the function q: (-r, r) --> R, so q'(0) = (∂f/∂xi)(x) = 0.
 
bubblesewa said:
Then suppose I have some function, say q(t), where q(t) = f(x + th) for |t| < r. Then the point 0 is an extreme point of the function q: (-r, r) --> R, so q'(0) = (∂f/∂xi)(x) = 0.

Hi bubblesewa! :smile:

Yes, in principle that's right …

you define a line with that parameter t, and since, as you say, 0 is an extreme point of the line, the derivative along the line must be zero. :smile:

However, you haven't yet defined h so as to get the line that gives you ∂f/∂xi, have you? :wink:
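For what it's worth, the missing choice of h that the reply points at can be sketched as follows (this just fills in the standard step; it is one way to finish, not necessarily the only one):

```latex
% Choose h = e_i, the i-th standard basis vector, so the line runs
% along the x_i-axis:
q_i(t) = f(x + t\,e_i), \qquad |t| < r .
% Since 0 is a local minimizer of q_i on (-r, r) and q_i is
% differentiable at 0, the one-variable result gives
q_i'(0) = \lim_{t \to 0} \frac{f(x + t\,e_i) - f(x)}{t}
        = \frac{\partial f}{\partial x_i}(x) = 0 .
% As i was arbitrary, Df(x) = (0, \dots, 0) = 0.
```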
 
