This is starting to tick me off. I've emailed the professor about this, because the information seems contradictory. According to the textbook:
If the derivative of a function is greater than or equal to zero on an interval, the function is increasing on that interval.
If the derivative of a function is greater than zero on an interval, the function is strictly increasing on that interval.
How does being strictly increasing not imply that the derivative is greater than zero, if that's what's in the definition?
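In symbols, as I read them (for a function f that is differentiable on an interval I), the two statements are:

\[
f'(x) \ge 0 \ \text{for all } x \in I \;\Longrightarrow\; f \ \text{is increasing on } I,
\]
\[
f'(x) > 0 \ \text{for all } x \in I \;\Longrightarrow\; f \ \text{is strictly increasing on } I.
\]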
Here is the email I sent:
Professor,
For HW11 #5(a), should the problem say "show that f is increasing on I" rather than "strictly increasing"? The derivative of f is zero at pi/2, and strictly increasing requires the derivative to be greater than zero at every point of the interval.
f(x) = x^3 is strictly increasing on (-5, 5) and f'(0) = 0.
If f'(x) > 0 on I, then f is strictly increasing on I.
If f is strictly increasing on I, then f' is greater than or equal to 0 on I.
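To spell out the distinction: f'(x) > 0 is a sufficient test, not the definition of strictly increasing. The definition only talks about function values, and checking it directly for x^3 shows why the converse of the test fails. A sketch, assuming the standard definition:

\[
f \ \text{is strictly increasing on } I \iff \big(x_1, x_2 \in I,\ x_1 < x_2 \implies f(x_1) < f(x_2)\big).
\]

For f(x) = x^3 and x_1 < x_2,

\[
x_2^3 - x_1^3 = (x_2 - x_1)\left(\Big(x_2 + \tfrac{x_1}{2}\Big)^2 + \tfrac{3}{4}\,x_1^2\right) > 0,
\]

because x_2 - x_1 > 0 and the second factor can only vanish when x_1 = x_2 = 0. So x^3 is strictly increasing on (-5, 5) even though its derivative is 0 at one point.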
The book is making a distinction between strictly increasing and increasing. According to those statements, #4 is strictly increasing, but #5 only comes out as increasing. Just to clarify,
for #4, f(x) = x + 2√2, and
for #5, f(x) = x - pi + cos x.
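Working out the derivatives (assuming I have the two functions copied correctly):

\[
\#4:\quad f(x) = x + 2\sqrt{2} \;\Longrightarrow\; f'(x) = 1 > 0 \ \text{for every } x,
\]
\[
\#5:\quad f(x) = x - \pi + \cos x \;\Longrightarrow\; f'(x) = 1 - \sin x \ge 0, \quad f'(x) = 0 \ \text{exactly when } x = \tfrac{\pi}{2} + 2k\pi.
\]

So the "> 0" hypothesis holds everywhere for #4 but fails at isolated points for #5, which is why the theorem with the strict inequality only applies to #4.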
On page 249, problem 26.8 says f is increasing on I iff the derivative is greater than or equal to 0 for all x in I.
On page 245, Theorem 26.8 says that if the derivative is greater than 0 for all x in I, then f is strictly increasing.
The bottom line: being strictly increasing does not imply f'(x) > 0 at every point. See my x^3 example above.
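In fact, your #5 function is strictly increasing too, even though its derivative vanishes at pi/2. A sketch using only the Mean Value Theorem, assuming f(x) = x - pi + cos x on whatever interval I the problem specifies:

\[
a < b \ \text{in } I \;\Longrightarrow\; f(b) - f(a) = f'(c)(b - a) \ge 0 \ \text{for some } c \in (a, b), \ \text{since } f'(x) = 1 - \sin x \ge 0.
\]

So f is increasing. If f(a) = f(b) held for some a < b, then f would be constant on [a, b], forcing f'(x) = 1 - sin x to be identically zero there, i.e. sin x = 1 on all of [a, b]. That is impossible, because sin x = 1 only at the isolated points x = pi/2 + 2k*pi. Hence f(a) < f(b) whenever a < b, which is exactly what strictly increasing means.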