1. The problem statement, all variables and given/known data

Suppose f is differentiable on [0,1] and that f(0)=0. Prove that if f' is increasing on (0,1), then f(x)/x is increasing on (0,1).

2. Relevant equations

3. The attempt at a solution

I've tried several things. I tried applying the definition of increasing to f', but I don't really get anywhere with that. I've also used the Mean Value Theorem, but I'm not sure what to apply it to. I tried the interval (0,1): since f(0)=0, there is a c in (0,1) such that f'(c) = (f(1)-f(0))/(1-0) = f(1). I applied it again on the interval (0,c) to get a d in (0,c) such that f'(d) = (f(c)-f(0))/(c-0) = f(c)/c. Since d < c and f' is increasing, f(c)/c = f'(d) ≤ f'(c) = f(1)/1. I could keep applying it on smaller and smaller intervals closer to 0, but I don't see how that proves f(x)/x is increasing on the whole interval.
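To keep the Mean Value Theorem steps straight, here is the attempt above written out in LaTeX (c and d are the same points as in the text; this is just a restatement of what I have so far, not a full proof):

[tex]
\begin{aligned}
&\exists\, c \in (0,1):\quad f'(c) = \frac{f(1)-f(0)}{1-0} = f(1),\\
&\exists\, d \in (0,c):\quad f'(d) = \frac{f(c)-f(0)}{c-0} = \frac{f(c)}{c},\\
&d < c \ \text{and}\ f' \ \text{increasing} \;\Longrightarrow\; \frac{f(c)}{c} = f'(d) \le f'(c) = \frac{f(1)}{1}.
\end{aligned}
[/tex]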