brad sue
Hi,
Can someone explain to me the relation between the degree of a Taylor series (TS) and the error? This is for my numerical methods class, and I cannot find an answer to my question in my textbook.
I mean, when we have a function Q of two variables x and y, and we use a first-order version of the TS to estimate the error of Q:

∆Q(x, y) ≈ (∂Q/∂x)·∆x + (∂Q/∂y)·∆y   (first order)
We want to compare the error contribution of each term to see which one is greater (the one in x or the one in y), i.e. which variable I need to measure with more precision.
I don't know if I am clear enough.
For example, if after computing ∆Q(x, y) the x term has an exponent (order of magnitude) of -2 and the y term an exponent of -0.5, which error is greater?
Thank you
Brad
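The comparison Brad describes can be sketched numerically. This is a minimal illustration, not from the original post: the function Q(x, y) = x²·y and the uncertainties Δx = 10⁻² and Δy = 10⁻⁰·⁵ are made-up examples chosen to match the exponents mentioned in the question. The partial derivatives are approximated with central finite differences.

```python
# Hedged sketch of first-order error propagation for a hypothetical Q(x, y).
# Q, x, y, dx, dy below are illustrative assumptions, not from the question.

def Q(x, y):
    return x**2 * y

def partial(f, x, y, wrt, h=1e-6):
    # Central finite-difference approximation of a partial derivative.
    if wrt == "x":
        return (f(x + h, y) - f(x - h, y)) / (2 * h)
    return (f(x, y + h) - f(x, y - h)) / (2 * h)

x, y = 3.0, 2.0
dx = 10**-2      # uncertainty in x (exponent -2)
dy = 10**-0.5    # uncertainty in y (exponent -0.5)

term_x = abs(partial(Q, x, y, "x")) * dx   # |∂Q/∂x| * Δx
term_y = abs(partial(Q, x, y, "y")) * dy   # |∂Q/∂y| * Δy

print("x contribution:", term_x)
print("y contribution:", term_y)
print("first-order bound on ΔQ:", term_x + term_y)
```

For these sample numbers the y contribution dominates, because 10⁻⁰·⁵ ≈ 0.316 is much larger than 10⁻² = 0.01; a larger (less negative) exponent means a larger error. Note that the partial derivatives also scale each term, so the exponents of Δx and Δy alone do not settle the comparison in general.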