Show that f(x) is constant

1. Dec 22, 2011

Jamin2112

1. The problem statement, all variables and given/known data

(I don't have my book with me, so this may not be the correct word-for-word representation of the exercise)

Suppose f(x) is differentiable on the whole real line. Show that f(x) is constant if for all real numbers x and y,

|f(x)-f(y)| ≤ (x-y)².

2. Relevant equations

Definition of a derivative

3. The attempt at a solution

f(x)-f(y) ≤ |f(x)-f(y)|, so by basic algebra we have [f(x)-f(y)]/(x-y) ≤ x - y. Letting x approach y on both sides of the inequality yields f '(x) ≤ 0.

........ Now I somehow need to show that f '(x) ≥ 0. Ideas?

2. Dec 22, 2011

I like Serena

Try using that:

|f(x)-f(y)| ≤ (x-y)² = |x-y| |x-y|

3. Dec 22, 2011

Jamin2112

0 ≤ |f(x)-f(y)| ≤ (x-y)² = |x-y| |x-y|

----> 0 ≤ |f(x)-f(y)| / |x-y| ≤ |x-y|
----> f '(x) = 0 by the Squeeze Theorem

My book uses lim_{t→x} (f(t)-f(x))/(t-x) for f '(x), which I guess is equivalent to lim_{t→x} |f(t)-f(x)|/|t-x|.
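Writing the squeeze out with that limit definition (just a sketch of the step above):

$$0 \le \left|\frac{f(t)-f(x)}{t-x}\right| \le |t-x| \quad \text{for } t \ne x,$$

and the right-hand side goes to 0 as t → x, so lim_{t→x} |f(t)-f(x)|/|t-x| = 0, i.e. |f '(x)| = 0, and hence f '(x) = 0 for every x.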

4. Dec 22, 2011

I like Serena

Yep. :)

The latter limit is actually |f'(x)|.

5. Dec 22, 2011

jgens

You actually do not need to use calculus to prove this result! In particular, fix x0 < y0 and, for each positive integer n, set δ = (y0-x0)/n. Telescoping over the partition points x0, x0+δ, …, x0+nδ = y0, each consecutive pair differs by δ, so:

$$|f(y_0)-f(x_0)| \le \sum_{i=1}^{n}\left|f(x_0+i\delta)-f(x_0+(i-1)\delta)\right| \le \sum_{i=1}^{n}\delta^2 = \frac{(y_0-x_0)^2}{n}$$

Since our choice of n was arbitrary, this forces |f(y0)-f(x0)| = 0, or equivalently f(y0) = f(x0).
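A quick numerical sketch of that telescoping bound (the function, endpoints, and partition sizes below are chosen arbitrarily for illustration; any f satisfying the hypothesis is constant, so a constant sample is used):

```python
# Illustration of the telescoping bound: partition [x0, y0] into n pieces
# of width delta = (y0 - x0)/n. If |f(x) - f(y)| <= (x - y)^2 for all x, y,
# each consecutive pair of partition points contributes at most delta^2, so
# the telescoping sum is at most n * delta^2 = (y0 - x0)^2 / n, which
# shrinks as n grows.

def f(x):
    return 3.0  # sample function satisfying |f(x)-f(y)| <= (x-y)^2

def telescoping_sum_and_bound(x0, y0, n):
    delta = (y0 - x0) / n
    total = sum(abs(f(x0 + i * delta) - f(x0 + (i - 1) * delta))
                for i in range(1, n + 1))
    return total, (y0 - x0) ** 2 / n

for n in (1, 10, 100, 1000):
    total, bound = telescoping_sum_and_bound(0.0, 2.0, n)
    assert total <= bound  # the bound holds, and shrinks like 1/n
```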

Last edited: Dec 23, 2011
6. Dec 22, 2011

Jamin2112

But I need to show that f '(x) = 0.

Wut do?

7. Dec 22, 2011

JG89

8. Dec 22, 2011

I like Serena

Indeed. Your work is fine.

You showed that |f'(x)|=0 which implies that f'(x)=0.

9. Dec 23, 2011

Jamin2112

Thanks so much! I have another question.

Suppose f is defined and differentiable for every x > 0, and f '(x) → 0 as x → +∞. Put g(x) = f(x+1) - f(x). Prove that g(x) → 0 as x → +∞.

That g(x) → 0 as x → +∞ is quite obvious, and its proof should be too.

Attempt: Fix ε > 0. We seek a positive real number x' such that |f(x+1) - f(x)| < ε whenever x ≥ x'.

I also know that for any ∂ > 0 there exists an x* such that | lim_{t→x} (f(t) - f(x)) / (t-x) | < ∂ whenever x ≥ x*.

Where to go from here, I know not. Hint?

10. Dec 23, 2011

Dick

Try thinking about doing a proof by contradiction. Suppose g(x) does NOT go to zero. Can you use that to prove f'(x) does NOT go to zero?

11. Dec 23, 2011

I like Serena

The Mean value theorem states that if f is continuous on an interval [a,b] and differentiable on (a,b), there is a value c in (a,b) such that:
$$f'(c)={f(b)-f(a) \over b-a}$$

Perhaps you can use that?

12. Dec 23, 2011

Jamin2112

Suppose g(x) does not approach 0 as x approaches ∞. Then there exists ε > 0 such that if x is any positive real number, there is another positive real number x' > x with |g(x')| = |f(x'+1) - f(x')| ≥ ε. Using what I like Serena (Serena Williams?) said, there exists a point σ in (x', x'+1) such that f '(σ) = (f(x'+1) - f(x')) / ((x'+1) - x') = f(x'+1) - f(x').

Am I almost there? Is the final step staring me in the face? I need to go watch John Stossel for an hour, then I'll be back.

13. Dec 23, 2011

Dick

If you'll agree that means |f'(sigma)|>=epsilon for arbitrarily large values of sigma, I think that would rule out f'(x) approaching 0 as x->infinity, wouldn't it? Let us know when you are back from Stossel. And I really can't condone watching Fox News. May hurt your mathematical skills.

Last edited: Dec 23, 2011
14. Dec 24, 2011

Jamin2112

You're silly. John Stossel is on Fox Business Network.

And yes, the proof by contradiction seems to have worked out. I'll hit you guys up with s'more questions later.
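For completeness, the same idea also runs directly, without the contradiction: by the Mean value theorem, for each x > 0 there is a c_x in (x, x+1) with

$$g(x) = f(x+1)-f(x) = f'(c_x)\bigl((x+1)-x\bigr) = f'(c_x),$$

and since c_x > x, we have c_x → +∞ as x → +∞, so f '(c_x) → 0 and therefore g(x) → 0.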

15. Dec 24, 2011

Jamin2112

New Question:

Suppose
(a) f is continuous for x ≥ 0,
(b) f'(x) exists for x > 0,
(c) f(0) = 0,
(d) f' is monotonically increasing.
Put
g(x) = f(x) / x (x > 0)
and prove that g is monotonically increasing.

Proof (Attempt). By Theorem 5.11(a), it suffices to show that g'(x) ≥ 0 for all x > 0. By Theorem 5.3(c),

g'(x) = [x · f '(x) - f(x)] / x² = f '(x)/x - f(x)/x²,

so we need to show that

f '(x)/x - f(x)/x² ≥ 0 for all x > 0,

or equivalently

f '(x) ≥ f(x)/x for all x > 0.

I'm a little stuck now. None of the theorems I invoked requires that f be continuous anywhere [though of course differentiability implies continuity, so without (a) I'd still know that f is continuous on (0, ∞)]. I'm sure the next step has something to do with the continuity of f. Any suggestions?

16. Dec 24, 2011

I like Serena

Again the Mean value theorem can come to the rescue.
Can you apply it to f(x)/x?

Btw, the specific conditions, like continuity at the boundary, are preconditions for the use of the theorem.
(Perhaps you should check those in the Wikipedia article to make your proof complete.)

Last edited: Dec 24, 2011
17. Dec 29, 2011

Jamin2112

That's why I'm skeptical about the following proof (which I found on the internet): it invokes the Mean Value Theorem without the precondition of continuity at the boundary. What's the deal?

18. Dec 29, 2011

I like Serena

It is applicable in this case.
Since f is differentiable on (a,b), it is also continuous on (a,b).

Now consider the interval [x,y], which is a subinterval of (a,b).
It meets all the conditions of the Mean value theorem.
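Concretely, for the problem at hand: (a) and (b) let you apply the Mean value theorem on [0, x], and with (c), f(0) = 0, it gives a c in (0, x) with

$$\frac{f(x)}{x} = \frac{f(x)-f(0)}{x-0} = f'(c).$$

Since f ' is monotonically increasing by (d) and c < x, we get f '(x) ≥ f '(c) = f(x)/x, which is exactly the inequality your attempt needed.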

19. Dec 29, 2011

Jamin2112

Ah, I see. Thanks so much for the clarification.

20. Dec 30, 2011

Jamin2112

Tell me if this is a good thorough proof. My professor is very stingy, so I can't be as willy-nilly as the other proof above.