Hello everyone! I'm studying from Spivak's *Calculus* on my own and ran into a problem I can't resolve in Theorem 1 of Chapter 11 (third edition). It's probably a very simple point (Spivak calls it an easy theorem), but I'm still at a roadblock.

Spivak wants to prove that if f is a function defined on (a,b), f is differentiable at x, and x is a maximum (or minimum) point for f on (a,b), then f'(x) = 0.

Spivak shows that if h > 0 (with x + h in (a,b)), then f(x+h) ≤ f(x), so [f(x+h) - f(x)] / h ≤ 0. From this he concludes that the one-sided limit of this quotient, as h approaches 0 from above, is also ≤ 0. (Sorry, I'm not sure how to type limit notation on the computer.)
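For reference, here is that step written out in LaTeX-style limit notation (just a transcription of the inequalities above, nothing new):

```latex
\text{If } x \text{ is a maximum point of } f \text{ on } (a,b) \text{ and } h > 0, \text{ then } f(x+h) \le f(x), \text{ so}
\frac{f(x+h) - f(x)}{h} \le 0,
\qquad\text{and hence}\qquad
\lim_{h \to 0^{+}} \frac{f(x+h) - f(x)}{h} \le 0.
```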

I wasn't sure how he made the leap from "the quotient is ≤ 0" to "the one-sided limit is ≤ 0." I used a proof by contradiction to show that, *if* the one-sided limit exists, it must be ≤ 0, but that argument assumes it exists. So my main issue is that I don't see how we can prove that the one-sided limit exists in the first place.
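For what it's worth, here is a sketch of the contradiction argument I mean, written out (this assumes the limit L exists, which is exactly my sticking point):

```latex
\text{Suppose } L = \lim_{h \to 0^{+}} \frac{f(x+h)-f(x)}{h} \text{ exists and } L > 0.
\text{Taking } \varepsilon = L/2, \text{ there is a } \delta > 0 \text{ such that}
\frac{f(x+h)-f(x)}{h} > L - \varepsilon = \frac{L}{2} > 0 \quad \text{whenever } 0 < h < \delta,
\text{which contradicts } \frac{f(x+h)-f(x)}{h} \le 0. \text{ Hence } L \le 0.
```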

I was thinking of using the given fact that f is differentiable at x to show that the one-sided limit must exist. However, Spivak doesn't make this argument explicitly, and I also wondered: even if the two-sided limit at a maximum did not exist, wouldn't the one-sided limits still exist (just not be equal to each other)? So is there any way to prove that the one-sided limit exists without using the differentiability of the function at x?
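To convince myself of the inequality numerically, I tried a toy example (my own sanity check, not from Spivak's text): f(x) = -x² has a maximum at x = 0, and the right-hand difference quotients are all ≤ 0 and tend to 0 from below:

```python
# Toy sanity check (my own example, not from Spivak's text):
# f(x) = -x^2 on (-1, 1) has a maximum at x = 0, and f'(0) = 0.

def f(x):
    return -x * x

x = 0.0
hs = [0.1, 0.01, 0.001, 0.0001]

# Right-hand difference quotients (f(x+h) - f(x)) / h; for this f each equals -h.
quotients = [(f(x + h) - f(x)) / h for h in hs]

# Every quotient is <= 0, and they approach 0 from below,
# matching lim_{h -> 0+} [f(x+h) - f(x)] / h <= 0.
assert all(q <= 0 for q in quotients)
assert abs(quotients[-1]) < 1e-3
print(quotients)
```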

Thanks for any input

**Physics Forums | Science Articles, Homework Help, Discussion**

# Simple Calculus Question