1. The problem statement, all variables and given/known data

By using the mean value theorem, show that ln(1+x) < x whenever x > 0.

2. Relevant equations

3. The attempt at a solution

There is another example in my book where they just use the formula f'(c) = (f(b) - f(a))/(b - a), but I'm not sure how to work out my [a,b] interval. Any ideas?
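One natural choice of interval (a suggestion, not something stated in the post) is to apply the mean value theorem to f(t) = ln(1+t) on [a, b] = [0, x]:

```latex
% Hedged sketch: one standard choice of interval for this problem.
% Take f(t) = \ln(1+t) on [a,b] = [0,x] with x > 0; f is continuous on
% [0,x] and differentiable on (0,x), so the MVT applies.
\[
  f'(c) = \frac{f(x) - f(0)}{x - 0}
  \quad\Longrightarrow\quad
  \frac{1}{1+c} = \frac{\ln(1+x) - \ln 1}{x} = \frac{\ln(1+x)}{x}
  \quad\text{for some } c \in (0, x).
\]
% Since c > 0, we get 1/(1+c) < 1, hence \ln(1+x)/x < 1,
% i.e. \ln(1+x) < x for every x > 0.
```

The key observation is that x itself can be the right endpoint of the interval, with 0 as the left endpoint, since the inequality must hold for every x > 0.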