This is starting to tick me off, and I've emailed the professor about it, because it seems like contradictory information. According to the textbook: if the derivative of a function is greater than or equal to zero on an interval, the function is increasing on that interval; if the derivative is strictly greater than zero on the interval, the function is strictly increasing. So how can a function be strictly increasing without its derivative being greater than zero, if that's exactly what's in the definition?
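Just so my reading is clear, here is how I understand the textbook's two statements (my own notation, with $I$ for the interval, not the book's wording):

```latex
\begin{align*}
f'(x) \ge 0 \text{ for all } x \in I &\;\Longrightarrow\; f \text{ is increasing on } I,\\
f'(x) > 0 \text{ for all } x \in I &\;\Longrightarrow\; f \text{ is strictly increasing on } I.
\end{align*}
```

What I can't tell is whether those arrows are supposed to go both ways, that is, whether "strictly increasing on $I$" is supposed to imply $f'(x) > 0$ on $I$.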