Prove that the gradient is zero at a local minimum.


Homework Help Overview

The discussion revolves around proving that the gradient of a function is zero at a local minimum. The function F maps R^n to R and is assumed to have first-order partial derivatives. The original poster takes x to be a local minimizer of F and aims to show that the gradient of F at that point is zero.

Discussion Character

  • Exploratory, mathematical reasoning, assumption checking

Approaches and Questions Raised

  • Participants explore restricting F to a single coordinate direction through the local minimizer, defining a new one-variable function g, and ask how the derivative of g relates to the gradient of F. They also discuss the conditions under which g has a minimum and how that relates to the original function F.

Discussion Status

Participants are actively engaging with hints and exploring different approaches to the problem. Some have made connections between the properties of the function g and the requirements for showing that the gradient of F is zero, while others are seeking further clarification on specific steps in the reasoning process.

Contextual Notes

There are indications of confusion regarding the transition from multi-variable to single-variable analysis, as well as the assumptions that can be made about the differentiability of the functions involved. The original poster and others express uncertainty about how to proceed with the hints provided.

Bosley

Homework Statement


Suppose f: \mathbb{R}^n \to \mathbb{R} has first-order partial derivatives and that x \in \mathbb{R}^n is a local minimizer of f, that is, there exists an r > 0 such that
f(x+h) \geq f(x) if dist(x, x+h) < r. Prove that
\nabla f(x) = 0.

Homework Equations


We want to show that f_{x_i}(x) = 0 for i = 1, \dots, n
So we want to show that \lim_{t\to 0}\frac{f(x + t e_i) - f(x)}{t} = 0

Where e_i is the ith standard basis element.
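
To connect this componentwise goal to the claim, recall (this is just the standard definition of the gradient, nothing beyond what the problem gives):

\nabla f(x) = \big( f_{x_1}(x), f_{x_2}(x), \dots, f_{x_n}(x) \big),

so \nabla f(x) = 0 exactly when every partial derivative f_{x_i}(x) vanishes.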

The Attempt at a Solution


We know f(x+h) \geq f(x) if ||(x+h) - x|| < r, that is, if ||h|| < r.
Consider |t| < r. Then ||t e_i|| = |t| < r.

So then f(x) \leq f(x + t e_i) for all t such that |t| < r, and f(x + t e_i) - f(x) \geq 0.

But I don't know where to go from here...insight?
 
There may be a better way of doing this. In particular, let \hat x \in \mathbb{R}^n be your local minimizer. Define the function g: \mathbb{R} \to \mathbb{R} by g(t) = F(\hat x + t e_i).

What can you say about the minima of g? How does this help you?
 
Well, the minima of g would occur where
g'(t) = \frac{d}{dt} F(\hat{x} + t e_i) = 0, I suppose, but I'm not sure how to employ that. Can you give me a little more of a hint? I'm not seeing what we can say about the derivative of F with respect to t, I guess.
 
Well, F has a local minimum at \hat x, right? So any nearby value of x gives F(x) \geq F(\hat x). In particular, what if we set x = \hat x + t e_i?

If that's still too esoteric, what happens to g(t) when we let t=0? What happens when t \neq 0?
 
Hmm...your post disappeared?
 
My tex code got screwed up and then I had to step away from the computer so I deleted it. Anyway:

g(0) = f(\hat{x})
g(t) = f(\hat{x} + t e_i) where t \neq 0
So g(t) \geq g(0) for all sufficiently small |t|.

But how is this different from what I originally had, which is that f(x + te_i) \geq f(x)? I'm sorry I'm having so much trouble putting together your hint.
 
It's okay.

The point here is that g has a minimum at 0. So in particular you can show very easily that \left. \frac{d}{dt} \right|_{t=0} g(t) = 0, since this is only one-dimensional, right? I'm not sure if you're allowed to assume this, but it's fairly easy to prove and you don't need to use vectors.

Now try finding g'(0) in terms of F.
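
For reference, a sketch of that one-dimensional fact, assuming only that g'(0) exists (it does here, since F has first-order partial derivatives): because g(t) \geq g(0) for all small |t|,

\lim_{t \to 0^+} \frac{g(t) - g(0)}{t} \geq 0 \qquad \text{and} \qquad \lim_{t \to 0^-} \frac{g(t) - g(0)}{t} \leq 0,

and both one-sided limits equal g'(0), so g'(0) = 0.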
 
Aha. I think I get it. Is this what you were getting at (note, I have slightly altered the notation):

Assume x is a local minimum of f.

Define g(h) = f(x + h e_i) for small values of h (so that |h| < r).
Note that g(0) = f(x). So,
g(0) \leq g(h) \; \forall \; |h| < r.
That is, g has a local minimum at h = 0.
Since g is a function of one variable with a local minimum at h = 0, we know that g'(0) = 0.

Then,
g'(h) = \lim_{t \to 0}\frac{g(h+t) - g(h)}{t} = \lim_{t\to 0} \frac{f(x+(h+t) e_i) - f(x+h e_i)}{t}

Since we know g'(0) = 0, plugging in h = 0 we get:
0 = \lim_{t\to0}\frac{f(x + t e_i) - f(x)}{t}, which is what we wanted to show.

eh?

(p.s. Thank you so much for your helpful hints.)
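
(A brief closing note for completeness: the limit above is exactly f_{x_i}(x), and i \in \{1, \dots, n\} was arbitrary, so every partial derivative of f vanishes at x and therefore \nabla f(x) = 0.)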
 
