- #1
blackbelt5400
This is not a homework problem. I've encountered something in my research in potential theory, and I need to prove the following.
Given:
h(t) = f(t) + g(t),
f '(t) < 0 for all t > 0,
g'(t) > 0 for all t, and g'(t) is monotonically increasing,
f '(0) = 0,
g'(0) > 0,
f '(t) has exactly one minimum, and [tex]\lim_{t\to\infty}f'(t) = 0[/tex].
Show that h'(t) has at most three zeros.
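In case it helps to see the target restated (this only rephrases the hypotheses above; nothing extra is assumed), the zeros of h'(t) can be read as intersection points:

[tex]h'(t) = f'(t) + g'(t) = 0 \quad\Longleftrightarrow\quad g'(t) = -f'(t).[/tex]

By the assumptions, [tex]-f'(t)[/tex] starts at 0, has exactly one maximum (since f '(t) has exactly one minimum), and decays back to 0 as [tex]t\to\infty[/tex], while [tex]g'(t)[/tex] is positive at t = 0 and monotonically increasing. So the claim is that this increasing curve meets the one-bump curve [tex]-f'(t)[/tex] in at most three points.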