fluidistic
Gold Member
Homework Statement
Interpolate the function ln(x+1) using a Newton's polynomial of degree 2. The points to be interpolated are [itex]x_0=0[/itex], [itex]x_1=0.6[/itex] and [itex]x_2=0.9[/itex].
Homework Equations
[itex]p_2(x)=c_0+c_1(x-x_0)+c_2(x-x_0)(x-x_1)[/itex].
The Attempt at a Solution
I used divided differences to calculate the coefficients of the polynomial.
[itex]f[x_1, x_2]=\frac{f(x_2)-f(x_1)}{x_2-x_1} \approx 0.1718502569[/itex].
[itex]f[x_0,x_1]\approx 0.7833393821[/itex].
[itex]f[x_0,x_1,x_2]\approx -0.6794[/itex].
This gives [itex]c_0=0, c_1\approx 0.78[/itex] and [itex]c_2 \approx -0.68[/itex].
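(Editor's note: a quick numerical cross-check of the divided-difference table, as a minimal Python sketch; this script is not part of the original post, and the function and variable names are illustrative.)

```python
import math

def divided_differences(xs, ys):
    """Return the Newton coefficients c[0], ..., c[n-1] by building the
    divided-difference table in place: after pass j, c[i] holds
    f[x_{i-j}, ..., x_i]."""
    n = len(xs)
    c = list(ys)
    for j in range(1, n):
        # update from the bottom up so lower-order entries are still available
        for i in range(n - 1, j - 1, -1):
            c[i] = (c[i] - c[i - 1]) / (xs[i] - xs[i - j])
    return c

xs = [0.0, 0.6, 0.9]
ys = [math.log(x + 1) for x in xs]  # f(x) = ln(x+1)
c = divided_differences(xs, ys)
print(c)  # [c_0, c_1, c_2]
```

Note that each divided difference divides by the spread of its nodes, e.g. [itex]f[x_1,x_2]=(f(x_2)-f(x_1))/(x_2-x_1)[/itex] with [itex]x_2-x_1=0.3[/itex], so comparing the script's table against a hand computation line by line makes it easy to spot where the two diverge.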
I've plotted both ln(x+1) and the polynomial, and it doesn't fit as it should: the polynomial matches ln(x+1) at 0 and 0.6, but not at 0.9 as it should.
Where did I go wrong?! I've redone the algebra twice, even with a calculator, and I still get the same wrong result.
Thanks for any help.