I'm computing the minimum number of terms a Taylor polynomial needs to approximate f(1.5) to within 0.0001, where f(x) = ln(x + 1), using Taylor's theorem, but I'm having a little trouble getting there.

I keep coming up with the absolute value of the (n+1)th derivative of ln(x + 1) as (n!)/[(x + 1)^(n+1)], in which case the largest value this derivative takes on [0, 1.5] would be n! (at x = 0). But if I use this with Taylor's theorem, the error bound I get is (n!)(1.5)^(n+1)/(n+1)! = (1.5)^(n+1)/(n+1) < 0.0001, which is not true for any n, since (1.5)^(n+1) grows faster than n + 1. Any help would be appreciated.
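To double-check my work, here is a quick Python sketch of the Lagrange remainder bound exactly as I've derived it (the function name `remainder_bound` is just mine, and M = n! is my claimed maximum of the (n+1)th derivative on [0, 1.5], which may be where my mistake is):

```python
from math import factorial

# Lagrange remainder bound for the degree-n Taylor polynomial of
# ln(x + 1) centered at 0, evaluated at x = 1.5, taking M = n! as the
# maximum of |f^(n+1)| on [0, 1.5].
def remainder_bound(n):
    # (n!)(1.5)^(n+1)/(n+1)!  simplifies to  (1.5)^(n+1)/(n+1)
    return factorial(n) * 1.5 ** (n + 1) / factorial(n + 1)

for n in (1, 5, 10, 50, 100):
    print(n, remainder_bound(n))
```

Printing the bound for increasing n shows it growing without limit rather than shrinking below 0.0001, which matches what I found by hand.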