This is not a homework problem. I've encountered the following in my research in potential theory, and I need to prove it.

Given:

h(t) = f(t) + g(t),

f'(t) < 0 for all t > 0,

g'(t) > 0 for all t, and g'(t) is monotonically increasing,

f'(0) = 0,

g'(0) > 0,

f'(t) has exactly one minimum, and [tex]\lim_{t\to\infty}f'(t) = 0[/tex].

Show that h'(t) has at most three zeros.
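As a quick numerical sanity check (not a proof), here is a sketch with example functions of my own choosing that satisfy the hypotheses: f'(t) = -t e^{-t} (zero at t = 0, negative for t > 0, one minimum at t = 1, tending to 0 at infinity) and g'(t) = 0.05 + 0.01t (positive and increasing). Counting sign changes of h' on a fine grid brackets its zeros:

```python
import numpy as np

# Hypothetical example functions satisfying the stated hypotheses
# (chosen only for illustration, not taken from the original problem):
#   f'(t) = -t * exp(-t): f'(0) = 0, f'(t) < 0 for t > 0,
#                         exactly one minimum (at t = 1), f'(t) -> 0 as t -> inf
#   g'(t) = 0.05 + 0.01*t: positive and monotonically increasing
def fp(t):
    return -t * np.exp(-t)

def gp(t):
    return 0.05 + 0.01 * t

def hp(t):
    return fp(t) + gp(t)

# Count sign changes of h'(t) on a fine grid; each strict sign change
# brackets at least one zero of h'.
t = np.linspace(0.0, 10.0, 10001)
vals = hp(t)
sign_changes = int(np.sum(vals[:-1] * vals[1:] < 0))
print(sign_changes)  # -> 2 for this example, consistent with "at most three"
```

With these particular choices h' starts positive (h'(0) = g'(0) > 0), dips negative near t ≈ 0.05, and recovers between t = 3 and t = 4, giving two zeros; since h' also ends positive, an odd count would require a tangential zero, which the "at most three" bound allows.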

**Physics Forums - The Fusion of Science and Community**


# Analysis problem for my research


