Higher derivative means function is higher?

yairl
Hi,

Is there a theorem that says that if f(n) = g(n) and f'(x) >= g'(x) for each x > n, then f(x) >= g(x) for each x > n? Or is there a theorem that requires more properties of f and g and implies this?

Thanks!
 
It is implied that f' and g' exist, so f and g must be differentiable, which implies continuity. Without continuity, the condition f(n) = g(n) wouldn't matter, since f(n+1) could suddenly be less than g(n+1).

I think the general proof of this property invokes the mean value theorem, but it should be fairly intuitive when you consider the definition of the derivative.
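As a quick numerical sanity check (not a proof), here is a sketch with a hypothetical example: f(x) = x², g(x) = x, and n = 1, so f(1) = g(1) and f'(x) = 2x ≥ 1 = g'(x) for all x > 1.

```python
# Sanity check of the claim with hypothetical example functions:
# f(x) = x**2, g(x) = x, n = 1, so f(n) = g(n) = 1 and
# f'(x) = 2x >= 1 = g'(x) for every x > 1.

def f(x):
    return x * x

def g(x):
    return x

n = 1.0
assert f(n) == g(n)

# Sample points to the right of n and confirm f stays above g.
xs = [n + 0.1 * k for k in range(1, 100)]
assert all(f(x) >= g(x) for x in xs)
print("f(x) >= g(x) held at all sampled x > n")
```

Of course, sampling a finite set of points only illustrates the statement; the proofs below establish it in general.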
 
Do you consider continuously differentiable functions ##f, g : \mathbb{R} \to \mathbb{R}## such that for a certain ##n \in \mathbb{Z}## it holds that ##f(n) = g(n)## and ##f'(x) \ge g'(x)## for each ##x > n##?

If so, then of course for each ##x > n## it holds by the fundamental theorem of calculus that
$$
f(x) = f(n) + \int_n^x{f'(y)\,dy} \ge g(n) + \int_n^x{g'(y)\,dy} = g(x)
$$
Or did you have something else in mind?

In fact, you only need ##f'(x) \ge g'(x)## for a.e. ##x > n## and ##f'## and ##g'## locally integrable. Hence it suffices for ##f## and ##g## to be absolutely continuous on every interval. However, in this setting you must replace the Riemann integral by the Lebesgue integral.
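The fundamental-theorem argument above can be illustrated numerically (a sketch, not a proof): reconstruct f and g from their derivatives with a simple midpoint Riemann sum and check the resulting inequality. The example derivatives here are hypothetical, chosen so that ##f'(x) = \cos x + 2 \ge 1 = g'(x)## everywhere.

```python
# Illustrates the FTC argument numerically with hypothetical example
# derivatives: f'(x) = cos(x) + 2 >= 1 = g'(x) for all x, and f(n) = g(n).
import math

def fprime(x):
    return math.cos(x) + 2.0   # >= 1 everywhere

def gprime(x):
    return 1.0

n = 0.0
f_n = g_n = 5.0                # shared starting value f(n) = g(n)

def integrate(h, a, b, steps=10000):
    # Midpoint-rule approximation of the integral of h over [a, b].
    dx = (b - a) / steps
    return sum(h(a + (k + 0.5) * dx) for k in range(steps)) * dx

for x in [0.5, 1.0, 2.0, 5.0]:
    f_x = f_n + integrate(fprime, n, x)   # f(x) = f(n) + int_n^x f'
    g_x = g_n + integrate(gprime, n, x)   # g(x) = g(n) + int_n^x g'
    assert f_x >= g_x
print("FTC reconstruction gives f(x) >= g(x) at all tested x")
```

Since the integrand f' − g' is nonnegative on [n, x], the integral inequality holds term by term, exactly as in the displayed derivation.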
 
Assuming that the functions are continuous and differentiable at all x > n, it seems pretty simple to prove by contradiction. Assume that f(n) = g(n) and f'(x) ≥ g'(x) ∀(x > n) and further assume that ∃(a > n) such that f(a) < g(a). Then we have f(a) - g(a) < 0. Differentiating both sides, we get f'(a) - g'(a) < 0, or f'(a) < g'(a), which contradicts our initial assumption of f'(x) ≥ g'(x) ∀(x > n).

Caveat: I'm not a mathematician, so I might have missed something fundamental here.

Edit: I did miss something. Once you differentiate, you get f'(a) - g'(a) = 0, so the proof doesn't work.
Further edit: just disregard this entire post. See RUber's post below
 
TeethWhitener said:
Assuming that the function is continuous and differentiable at all x > n, it seems pretty simple to prove by contradiction. Assume that f(n) = g(n) and f'(x) ≥ g'(x) ∀(x > n) and further assume that ∃(a > n) such that f(a) < g(a). Then we have f(a) - g(a) < 0.

I am with you up to here. The appropriate next step would be to use the mean value theorem, which states that for a function f continuous on [n, a] and differentiable on (n, a), with n < a, there exists a point b in (n, a) such that ##f'(b) =\frac{ f(a) - f(n) }{a - n} ##.

Differentiating both sides, we get f'(a) - g'(a) < 0, or f'(a) < g'(a), which contradicts our initial assumption of f'(x) ≥ g'(x) ∀(x > n).

Caveat: I'm not a mathematician, so I might have missed something fundamental here.
You cannot just pop a derivative onto a function and assume the relations will hold.
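For completeness, here is a sketch of how the mean value theorem closes the contradiction, applied to the difference ##h = f - g## (assuming f and g are continuous on ##[n, a]## and differentiable on ##(n, a)##). Since ##h(n) = f(n) - g(n) = 0## and ##h(a) = f(a) - g(a) < 0##, the MVT gives a point ##b \in (n, a)## with
$$
h'(b) = \frac{h(a) - h(n)}{a - n} = \frac{h(a)}{a - n} < 0,
$$
that is, ##f'(b) < g'(b)##, which contradicts ##f'(x) \ge g'(x)## for all ##x > n##.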
 
RUber said:
I am with you up to here. The appropriate next step would be to use the mean value theorem, which states that for any continuous function f, with points f(n) and f(a), such that n<a, there exists a point b in [n,a] such that ##f'(b) =\frac{ f(a) - f(n) }{a - n} ##. You cannot just pop a derivative onto a function and assume the relations will hold.

Yep, you're right. I screwed that one up.
 