Let f(x,y) = e^(x^2 + y^2).
Use Taylor's theorem to show that the error in the linear approximation L_(1,1)(x,y) is at most 5e^2[(x-1)^2 + (y-1)^2] if 0 <= x <= 1, 0 <= y <= 1.
I've taken the first and second partial derivatives and tried plugging them into the theorem's remainder estimate, but the algebra gets messy and I can't simplify it to the bound above. I've written out my setup in the sketch below; can someone confirm this is the intended route?
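Concretely, here is what I have, assuming the standard second-order (Lagrange) remainder form of Taylor's theorem for two variables; I suspect the simplification hinges on bounding the cross term with 2|h||k| <= h^2 + k^2:

\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Second partials of f(x,y) = e^{x^2+y^2}:
\[
f_{xx} = (2+4x^2)\,e^{x^2+y^2}, \qquad
f_{yy} = (2+4y^2)\,e^{x^2+y^2}, \qquad
f_{xy} = 4xy\,e^{x^2+y^2}.
\]
% On the square 0 <= x <= 1, 0 <= y <= 1 we have e^{x^2+y^2} <= e^2, so
\[
|f_{xx}| \le 6e^2, \qquad |f_{yy}| \le 6e^2, \qquad |f_{xy}| \le 4e^2.
\]
% With h = x-1 and k = y-1, the second-order remainder form of Taylor's
% theorem gives a point c on the segment from (1,1) to (x,y) (which stays
% inside the square, since the square is convex) such that
\[
E(x,y) = \tfrac{1}{2}\bigl( f_{xx}(c)\,h^2 + 2 f_{xy}(c)\,hk + f_{yy}(c)\,k^2 \bigr).
\]
% Bounding each coefficient as above and the cross term via
% 2|h||k| <= h^2 + k^2:
\[
|E(x,y)| \le \tfrac{1}{2}\bigl( 6e^2 h^2 + 4e^2 (h^2+k^2) + 6e^2 k^2 \bigr)
           = 5e^2\bigl[ (x-1)^2 + (y-1)^2 \bigr].
\]
\end{document}

If the cross-term inequality is the right move, the mixed term folds into the squares and the constant comes out to exactly 5e^2, which matches the bound in the problem.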
Thanks guys.