Homework Statement
The Taylor expansion of ln(1+x) has terms which decay as 1/n.
Show that, by choosing an appropriate constant c, the Taylor series of
(1+cx)ln(1+x)
can be made to decay as 1/n^2
(assuming expansion about x=0)
Homework Equations
f(x)=[itex]\sum^{\infty}_{n=0} f^{(n)}(0) \frac{x^{n}}{n!}[/itex]
The Attempt at a Solution
I used Maple to differentiate this function and find the values of the first several derivatives at x=0:
[itex]f^{(0)}(0) = 0[/itex]
[itex]f^{(1)}(0) = 1[/itex]
[itex]f^{(2)}(0) = 2c-1[/itex]
[itex]f^{(3)}(0) = -3c+2[/itex]
[itex]f^{(4)}(0) = 8c-6[/itex]
[itex]f^{(5)}(0) = -30c+24[/itex]
f(x)=[itex]\frac{(0)x^{0}}{0!}+\frac{(1)x^{1}}{1!}+\frac{(2c-1)x^{2}}{2!}+\frac{(-3c+2)x^{3}}{3!}+\frac{(8c-6)x^{4}}{4!}+\frac{(-30c+24)x^{5}}{5!}+\cdots[/itex]
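For reference, here is a rough SymPy equivalent of the derivative computation (just a sketch for anyone without Maple; it should reproduce the values listed above):
[code]
# Sketch: n-th derivatives of (1+cx)ln(1+x) evaluated at x = 0 (SymPy, illustrative only)
import sympy as sp

x, c = sp.symbols('x c')
f = (1 + c*x) * sp.log(1 + x)

for n in range(6):
    value = sp.simplify(sp.diff(f, x, n).subs(x, 0))
    print(n, value)   # expect 0, 1, 2*c - 1, -3*c + 2, 8*c - 6, -30*c + 24
[/code]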
This is where I'm stuck... in order to get the terms decaying as 1/n^2, I get a different value of c for each term:
[itex]c_{0}=1[/itex]
[itex]c_{1}=1[/itex]
[itex]c_{2}=\frac{3}{4}[/itex]
[itex]c_{3}=\frac{8}{9}[/itex]
[itex]c_{4}=\frac{15}{16}[/itex]
[itex]c_{5}=\frac{24}{25}[/itex]
And I need one constant c that will do it all. Any help would be greatly appreciated.
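For completeness, the per-term values above can be reproduced by setting the magnitude of each coefficient equal to 1/n^2 and solving for c (a sketch in the same SymPy setup as before; the alternating sign is just taken from the derivative values):
[code]
# Sketch: solve |f^(n)(0)/n!| = 1/n^2 term by term, i.e. coefficient = (-1)^n / n^2
import sympy as sp

x, c = sp.symbols('x c')
f = (1 + c*x) * sp.log(1 + x)

for n in range(2, 6):
    coeff = sp.diff(f, x, n).subs(x, 0) / sp.factorial(n)   # f^(n)(0)/n!
    sol = sp.solve(sp.Eq(coeff, sp.Rational((-1)**n, n**2)), c)
    print(n, sol)   # expect [3/4], [8/9], [15/16], [24/25]
[/code]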