Prove that the gradient is zero at a local minimum.

Bosley

Homework Statement


Suppose f: \mathbb{R}^n \to \mathbb{R} has first-order partial derivatives and that x \in \mathbb{R}^n is a local minimizer of f, that is, there exists an r > 0 such that
f(x+h) \geq f(x) whenever dist(x, x+h) < r. Prove that
\nabla f(x) = 0.

Homework Equations


We want to show that f_{x_i}(x) = 0 for i = 1, \dots, n.
So we want to show that \lim_{t\to 0}\frac{f(x + t e_i) - f(x)}{t} = 0

Where e_i is the ith standard basis element.

The Attempt at a Solution


We know f(x+h) \geq f(x) if ||(x+h) - x|| < r, that is, if ||h|| < r.
Consider |t| < r. Then ||t e_i|| = |t| < r.

So then f(x) \leq f(x + t e_i) for all t such that |t| < r, and hence f(x + t e_i) - f(x) \geq 0.

But I don't know where to go from here...insight?
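As a numeric illustration of where that sign observation leads (not part of the proof): for a sample function with a known minimizer, both chosen by me for this sketch, the difference quotient along e_i is nonnegative for t > 0 and nonpositive for t < 0, so the two-sided limit can only be 0.

```python
# Illustration only: f(x) = x_1^2 + x_2^2 with minimizer at the origin
# (both f and the minimizer are my own choices, not from the problem).

def f(x):
    return x[0] ** 2 + x[1] ** 2

x = [0.0, 0.0]      # the local minimizer
e1 = [1.0, 0.0]     # standard basis vector e_1

def quotient(t):
    # difference quotient (f(x + t e_1) - f(x)) / t
    xt = [x[j] + t * e1[j] for j in range(2)]
    return (f(xt) - f(x)) / t

for t in [0.1, 0.01, -0.01, -0.1]:
    q = quotient(t)
    assert q * t >= 0   # same sign as t: >= 0 for t > 0, <= 0 for t < 0
    print(t, q)
```

The quotients squeeze toward 0 from both sides, which is exactly the one-sided-limit argument the proof needs.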
 
There may be a better way of doing this. In particular, let \hat x \in\mathbb R^n be your minimum. Define the function g: \mathbb R\to \mathbb R by g(t) = F(\hat x+te_i ).

What can you say about the minima of g? How does this help you?
 
Well, the minima of g would occur where
g'(t) = \frac{d}{dt} F(\hat{x} + t e_i) = 0, I suppose, but I'm not sure how to employ that. Can you give me a little more of a hint? I'm not seeing what we can say about the derivative of F with respect to t, I guess.
 
Well, the minimum of F is \hat x right? So any other value of x would give F(x) \geq F(\hat x). In particular, what if we set x = \hat x + t e_i?

If that's still too esoteric, what happens to g(t) when we let t=0? What happens when t \neq 0?
 
Hmm...your post disappeared?
 
My tex code got screwed up and then I had to step away from the computer so I deleted it. Anyway:

g(0) = f(\hat{x})
g(t) = f(\hat{x} + t e_i) where t \neq 0
So g(t) \geq g(0) for all sufficiently small |t|.

But how is this different from what I originally had, which is that f(x + te_i) \geq f(x)? I'm sorry I'm having so much trouble putting together your hint.
 
It's okay.

The point here is that g has a minimum at 0. So in particular you can show very easily that \left. \frac{d}{dt} \right|_{t=0} g(t) = 0, since this is only one dimensional, right? I'm not sure if you're allowed to assume this, but it's fairly easy to prove and you don't need to use vectors.

Now try finding g'(0) in terms of F.
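The one-variable fact being invoked here (derivative vanishes at an interior local minimum) can be spelled out with one-sided limits; a sketch, using the g defined above:

```latex
% If g has a local minimum at 0 and g'(0) exists, then g'(0) = 0:
% the quotient is >= 0 for small t > 0 and <= 0 for small t < 0.
\begin{aligned}
g'(0) &= \lim_{t \to 0^+} \frac{g(t) - g(0)}{t} \;\geq\; 0,\\
g'(0) &= \lim_{t \to 0^-} \frac{g(t) - g(0)}{t} \;\leq\; 0,\\
&\;\Longrightarrow\; g'(0) = 0.
\end{aligned}
```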
 
Aha. I think I get it. Is this what you were getting at (note, I have slightly altered the notation):

Assume x is a local minimum of f.

Define g(h) = f(x + h e_i) considering small values of h (so that |h| < r)
Note that g(0) = f(x). So,
g(0) \leq g(h) \forall |h| < r
That is, g has a local minimum at h = 0.
Since g is a function in one variable, we know that g'(0) = 0 since g has a local min at 0.

Then,
g'(h) = \lim_{t \to 0}\frac{g(h+t) - g(h)}{t}
g'(h) = \lim_{t\to 0} \frac{f(x+(h+t) e_i) - f(x+h e_i)}{t}

Since we know g'(0) = 0, plugging in h = 0 we get:
0 = \lim_{t\to0}\frac{f(x + t e_i) - f(x)}{t}, which is what we wanted to show.

eh?

(p.s. Thank you so much for your helpful hints.)
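A quick sanity check of the conclusion (illustration only, not a proof): at a known minimizer of a sample function, every central difference quotient along a basis vector is essentially 0, i.e. the numeric gradient vanishes. The function f and the minimizer a below are my own choices.

```python
# Illustration: f(x) = (x_1 - 1)^2 + (x_2 + 2)^2, minimized at a = (1, -2).
# Both f and a are hypothetical examples, not from the thread's problem.

def f(x):
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

a = [1.0, -2.0]   # the local minimizer
h = 1e-6          # step size for the difference quotient

def partial(i):
    # central difference approximation of f_{x_i}(a)
    xp = list(a); xp[i] += h
    xm = list(a); xm[i] -= h
    return (f(xp) - f(xm)) / (2 * h)

grad = [partial(i) for i in range(2)]
print(grad)   # both entries are ~0, matching \nabla f(a) = 0
```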
 