fluidistic
Homework Statement
Interpolate the function ln(x+1) using a Newton polynomial of degree 2. The interpolation nodes are x_0=0, x_1=0.6 and x_2=0.9.
Homework Equations
p_2(x)=c_0+c_1(x-x_0)+c_2(x-x_0)(x-x_1).
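Here the coefficients are the divided differences:
c_0=f[x_0], \quad c_1=f[x_0,x_1], \quad c_2=f[x_0,x_1,x_2],
where f[x_i,x_j]=\frac{f(x_j)-f(x_i)}{x_j-x_i} and f[x_0,x_1,x_2]=\frac{f[x_1,x_2]-f[x_0,x_1]}{x_2-x_0}.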
The Attempt at a Solution
I used divided differences to calculate the coefficients of the polynomial.
f[x_1, x_2]=\frac{f(x_2)-f(x_1)}{x_2-x_1} \approx 0.1718502569.
f[x_0,x_1]\approx 0.7833393821.
f[x_0,x_1,x_2] \approx -0.6794.
This gives c_0=0, c_1\approx 0.78 and c_2 \approx -0.68.
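In case it is useful, here is a minimal Python sketch of the divided-difference computation I am doing by hand (the function names divided_differences and newton_eval are my own, not from any library):

```python
import math

def divided_differences(xs, ys):
    """Return the Newton coefficients c_0, ..., c_n via divided differences."""
    n = len(xs)
    coefs = list(ys)  # start from the function values f(x_i)
    for j in range(1, n):
        # Work from the bottom up so lower-order differences are still available.
        for i in range(n - 1, j - 1, -1):
            coefs[i] = (coefs[i] - coefs[i - 1]) / (xs[i] - xs[i - j])
    return coefs  # coefs[k] == f[x_0, ..., x_k]

def newton_eval(xs, coefs, x):
    """Evaluate p(x) = c_0 + c_1(x-x_0) + c_2(x-x_0)(x-x_1) + ... by nesting."""
    result = coefs[-1]
    for k in range(len(coefs) - 2, -1, -1):
        result = result * (x - xs[k]) + coefs[k]
    return result

xs = [0.0, 0.6, 0.9]
ys = [math.log(x + 1.0) for x in xs]
c = divided_differences(xs, ys)
# The interpolant should reproduce ln(x+1) exactly at all three nodes.
for x, y in zip(xs, ys):
    print(x, newton_eval(xs, c, x), y)
```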
I've plotted both ln(x+1) and the polynomial, and it doesn't fit as it should. The polynomial matches ln(x+1) at x=0 and x=0.6, but not at x=0.9 as it should.
Where did I go wrong? I've redone the algebra twice, even with a calculator, and I'm still getting this wrong result.
Thanks for any help.