Exploring the Relationship between Taylor Series Degree & Error

In summary, a first-order Taylor expansion of a function Q with two variables x and y expresses its error as a sum with one term per variable. Comparing the magnitude of each term shows which variable contributes more to the error and should therefore be measured with more precision. This is useful for numerical methods in calculus and analysis.
  • #1
brad sue
Hi,

Can someone explain to me the relation between the degree of a Taylor series (TS) and the error? This is for my numerical methods class, and I cannot find an answer to my question in my textbook.

I mean, when we have a function Q with two variables x and y, and we use a first-order version of the TS to estimate the error of Q:

∆Q(x, y) = (∂Q/∂x)·∆x + (∂Q/∂y)·∆y   (1st order)

We want to compare the error of each term to know which is greater (the one in x or the one in y), i.e., which variable I need to measure with more precision.
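As a sketch of the comparison described above, here is a minimal example. The function Q(x, y) = x²·y and the uncertainty values are made-up illustrations, not from the thread:

```python
# First-order error propagation for the example Q(x, y) = x**2 * y.
# The partials are dQ/dx = 2*x*y and dQ/dy = x**2, so
# dQ = (2*x*y)*dx + (x**2)*dy. Comparing the magnitude of the two
# terms shows which measurement dominates the error.

def error_terms(x, y, dx, dy):
    term_x = abs(2 * x * y * dx)   # contribution of the x uncertainty
    term_y = abs(x ** 2 * dy)      # contribution of the y uncertainty
    return term_x, term_y

tx, ty = error_terms(x=3.0, y=2.0, dx=0.01, dy=0.01)
# The larger term identifies the variable to measure more precisely.
print("x term:", tx, "y term:", ty)
```

With these illustrative numbers the x term dominates, so x would be the variable to measure more carefully.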

I don't know if I am clear enough.

For example, if after finding ∆Q(x, y) the x term has a degree of -2 and the y term has a degree of -0.5, which error is greater?

Thank you

Brad
 
  • #2
this has nothing to do with number theory; if no one moves it you may want to delete the thread and repost it in calculus and analysis
 
  • #3
matt grime said:
this has nothing to do with number theory; if no one moves it you may want to delete the thread and repost it in calculus and analysis

ok thank you
 

1. What is a Taylor series?

A Taylor series is a mathematical representation of a function as an infinite sum of terms computed from the function's derivatives at a single point. Truncating the sum gives a polynomial approximation of the function near that point.

2. How is a Taylor series degree related to error?

The degree of a Taylor series determines the number of terms used in the approximation, which affects its accuracy. A higher degree generally results in a more accurate approximation, with less error, near the expansion point.
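The effect of degree on error can be sketched numerically. The choice of eˣ expanded about 0 and the evaluation point x = 0.5 are assumed examples:

```python
import math

def taylor_exp(x, degree):
    # Partial sum of the Taylor series of e**x about 0, up to the given degree.
    return sum(x ** k / math.factorial(k) for k in range(degree + 1))

x = 0.5
errors = [abs(math.exp(x) - taylor_exp(x, d)) for d in range(1, 6)]
# For this x, each extra degree shrinks the approximation error.
assert all(errors[i] > errors[i + 1] for i in range(len(errors) - 1))
```

For x = 0.5 the error drops by roughly an order of magnitude per added degree, illustrating the relationship described above.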

3. What factors can affect the relationship between Taylor series degree and error?

The choice of center point for the Taylor series, the choice of function being approximated, and the behavior of the function near the center point can all affect the relationship between Taylor series degree and error.

4. How is the error in a Taylor series approximation calculated?

The error in a Taylor series approximation can be calculated using the remainder term, which is the difference between the actual function value and the approximate value obtained from the Taylor series.
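One common form of the remainder is the Lagrange bound. A minimal sketch, again assuming eˣ about 0 (not a function from the thread):

```python
import math

# Lagrange remainder bound for e**x expanded about 0:
# |R_n(x)| <= M * x**(n+1) / (n+1)!  where M bounds the (n+1)-th
# derivative on [0, x]; for e**x that derivative is e**t <= e**x.

def remainder_bound(x, n):
    return math.exp(x) * x ** (n + 1) / math.factorial(n + 1)

x, n = 0.5, 3
actual = abs(math.exp(x) - sum(x ** k / math.factorial(k) for k in range(n + 1)))
bound = remainder_bound(x, n)
assert actual <= bound  # the true error never exceeds the Lagrange bound
```

The bound is computable without knowing the exact function value, which is what makes it useful for choosing a degree in advance.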

5. Can a Taylor series accurately represent any function?

No, a Taylor series can only accurately represent functions that are infinitely differentiable at the chosen center point. If a function has a discontinuity or sharp corner at the center point, the Taylor series will not accurately represent it.
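A minimal illustration of this limit, using |x| with an assumed center point a = 1: near a = 1 the function |x| coincides with x, so every Taylor coefficient matches those of x and the series reduces to T(x) = x, which is wrong on the far side of the corner at 0.

```python
# Taylor series of abs(x) about a = 1: near a = 1, abs(x) == x,
# so all higher derivatives vanish and the series is simply T(x) = x.
# It matches abs(x) to the right of the corner but fails past it.
def taylor_abs_about_1(x):
    return x

assert taylor_abs_about_1(2.0) == abs(2.0)    # correct right of the corner
assert taylor_abs_about_1(-2.0) != abs(-2.0)  # wrong past the corner at 0
```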
