A problem about differentiability

  • Thread starter: rasi
  • Tags: Differentiability
SUMMARY

The discussion centers on the problem of differentiability, specifically analyzing the behavior of a function f(x) that oscillates between -1 and 1. The key inequality f'(x)^2 + f''(x)^2 ≤ 1 indicates that the first and second derivatives cannot both be large simultaneously, which leads to contradictions if f(x) approaches the bounds of its oscillation. The challenge lies in rigorously demonstrating how the first derivative f'(x) must decrease to zero without violating the established constraints.

PREREQUISITES
  • Understanding of calculus concepts, particularly derivatives and their properties.
  • Familiarity with the behavior of oscillating functions.
  • Knowledge of inequalities involving derivatives.
  • Experience with geometric interpretations of mathematical problems.
NEXT STEPS
  • Study the implications of the inequality f'(x)^2 + f''(x)^2 ≤ 1 in more depth.
  • Explore geometric interpretations of differentiability in oscillating functions.
  • Investigate advanced calculus techniques for proving properties of derivatives.
  • Learn about theorems related to the behavior of functions constrained by their derivatives.
USEFUL FOR

Mathematics students, calculus instructors, and researchers in mathematical analysis who are interested in the properties of differentiable functions and their oscillatory behavior.

rasi
I tried to solve this problem and made a little progress, but I can't get any further. As far as I can tell, it needs a clever idea. Thanks in advance...
PROBLEM (attached image: 5625.jpg)

MY SOLUTION (attached image: 5625_ms_s.jpg)

 
This is pretty tricky (unless I missed something clever). Think about this geometrically: we know that f(x) is oscillating between -1 and 1 in some fashion. The statement
[tex]f'(x)^2+f''(x)^2 \leq 1[/tex]
says that the derivative of f and the second derivative cannot both be big at the same time. So for example if f'(x) is big when f(x) is near 1, it won't be able to decrease fast enough (because f''(x) is small) to prevent f(x) from crossing the value of 1, which gives a contradiction.
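
To make "cannot both be big" precise, this is just a rearrangement of the given inequality:
[tex]|f''(x)| \;\le\; \sqrt{1-f'(x)^2}\,,[/tex]
so the closer |f'(x)| gets to 1, the closer f''(x) is forced to 0, i.e. the slower f'(x) itself can change.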

The question then is how to put this into something more rigorous. Suppose we're at a point a for which f(a)^2 + f'(a)^2 > 1. The picture you should have in your head (not something you should assume, but something to give you an idea of what's going on) is f(a) and f'(a) both positive. Then f is eventually going to reach the value 1 unless f'(x) decreases and eventually hits 0. So the question you need to answer is: how fast can f'(x) decrease to 0, given that f'(x)^2 + f''(x)^2 ≤ 1?
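
One way to turn that question into an estimate (a sketch only, not necessarily the intended solution, and using the picture above: |f| ≤ 1 everywhere, f(a) > 0, f'(a) > 0) is to consider the auxiliary quantity
[tex]h(x) \;=\; f(x)-\sqrt{1-f'(x)^2}.[/tex]
Wherever 0 < f'(x) < 1,
[tex]h'(x) \;=\; f'(x)\left(1+\frac{f''(x)}{\sqrt{1-f'(x)^2}}\right)\;\ge\;0,[/tex]
because |f''(x)| ≤ sqrt(1 - f'(x)^2). So h is non-decreasing as long as f' stays positive, and h(a) = f(a) - sqrt(1 - f'(a)^2) > 0 precisely because f(a)^2 + f'(a)^2 > 1. At the first point b > a where f'(b) = 0 we would then have f(b) = h(b) + 1 ≥ h(a) + 1 > 1, contradicting |f| ≤ 1; the case where f' never reaches 0 needs a limiting version of the same estimate.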
 
Mod note: This is obviously not a precalc problem, so moving it to the Calculus & Beyond section.
 
