
## Main Question or Discussion Point

This is not a homework problem. I've encountered something in my research in potential theory, and I need to prove the following.

Given:

h(t) = f(t) + g(t),

f '(t) < 0 for all t > 0,

g'(t) > 0 for all t, and g'(t) is monotonically increasing,

f '(0) = 0,

g'(0) > 0,

f '(t) has exactly one minimum, and [tex]\lim_{t\to\infty}f'(t) = 0[/tex].

Show that h'(t) has at most three zeros.

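As a numerical sanity check (not a proof), one can pick concrete functions satisfying the hypotheses and count the sign changes of h'(t) on a fine grid. The choices below are my own hypothetical examples, not from the problem: f '(t) = -5t e^{-t} (zero at t = 0, negative for t > 0, a single minimum at t = 1, tending to 0 at infinity) and g'(t) = 0.1 + 0.05t (positive, monotonically increasing, g'(0) > 0):

```python
import math

# Hypothetical example functions satisfying the stated hypotheses:
# f'(t) = -5 t e^{-t}: f'(0) = 0, f'(t) < 0 for t > 0, exactly one
#         minimum (at t = 1), and f'(t) -> 0 as t -> infinity.
# g'(t) = 0.1 + 0.05 t: positive, monotonically increasing, g'(0) > 0.
def fp(t):
    return -5.0 * t * math.exp(-t)

def gp(t):
    return 0.1 + 0.05 * t

def hp(t):
    return fp(t) + gp(t)

# Count the sign changes of h'(t) on a fine grid over [0, 50];
# each transversal zero of h' produces exactly one sign change.
ts = [i * 0.001 for i in range(50001)]
vals = [hp(t) for t in ts]
sign_changes = sum(1 for a, b in zip(vals, vals[1:]) if a * b < 0)

print(sign_changes)  # number of simple zeros of h' detected on the grid
assert sign_changes <= 3
```

For this particular pair, h'(0) = g'(0) > 0, h' dips negative near the minimum of f ', and then becomes positive again as f ' decays and g' grows, giving two simple zeros, consistent with the claimed bound of three. Such experiments only illustrate the bound; the proof still has to rule out extra crossings using the monotonicity of g' and the single-minimum shape of f '.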