# Understanding Taylor Series Error & Degrees of Variables

In summary, the relationship between the degree of a Taylor series and its error can be assessed by comparing the error contributed by each term. This can be done by calculating the exact value and comparing it to the approximation, or analytically, by examining the next-largest terms in the series.
Hi,

can someone explain to me the relation between the degree of a Taylor series (TS) and the error? This is for my numerical methods class, and I cannot find an answer to my question in my textbook.

I mean, when we have a function Q of two variables x and y, and we use a version of the TS to estimate the error of Q by computing:

∆Q(x,y) = (∂Q/∂x)·∆x + (∂Q/∂y)·∆y (1st order)

We want to compare the error of each term to know which is greater (the one in x or the one in y), i.e. which variable I need to measure with more precision.

I don't know if I am clear enough.

For example, if after finding ∆Q(x,y) the x term has degree -2 and the y term has degree -0.5, which error is greater?

Thank you

The best thing would be to calculate the exact value and compare that to the approximation.

As for an analytical approach, just calculate the next-largest terms:

$$\Delta Q = \frac{\partial Q}{\partial x} \Delta x + \frac{\partial Q}{\partial y} \Delta y + \frac{1}{2}\frac{\partial^2 Q}{\partial x^2} (\Delta x)^2 + \frac{1}{2}\frac{\partial^2 Q}{\partial y^2} (\Delta y)^2 + \frac{\partial^2 Q}{\partial x \, \partial y} \Delta x \, \Delta y$$

Compare the (Δx)² term to the corresponding (Δy)² term. Include the mixed term to evaluate the whole expansion to second order, then compare that to your first-order approximation.
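A minimal sketch of this comparison, using a hypothetical function Q(x, y) = x²y at the point (2, 3) with assumed measurement errors Δx = 0.1 and Δy = 0.05 (none of these values come from the thread; they are chosen for illustration):

```python
# Hypothetical example: Q(x, y) = x**2 * y at (2, 3), with assumed
# measurement errors dx = 0.1 and dy = 0.05.
def Q(x, y):
    return x**2 * y

x, y = 2.0, 3.0
dx, dy = 0.1, 0.05

# First-order error terms; the partials of this Q were computed by hand:
# dQ/dx = 2*x*y, dQ/dy = x**2.
term_x = 2*x*y * dx       # 12 * 0.1  = 1.2
term_y = x**2 * dy        #  4 * 0.05 = 0.2
first_order = term_x + term_y

# Second-order correction: (1/2)Qxx dx^2 + (1/2)Qyy dy^2 + Qxy dx dy,
# with Qxx = 2*y, Qyy = 0, Qxy = 2*x.
second_order = 0.5 * (2*y) * dx**2 + 0.5 * 0.0 * dy**2 + (2*x) * dx * dy

exact = Q(x + dx, y + dy) - Q(x, y)
print(term_x, term_y)                          # x term dominates here
print(first_order, first_order + second_order, exact)
```

Comparing term_x to term_y shows which variable contributes more error; comparing the first-order and second-order totals to the exact difference shows how much each additional order buys you.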

The degree of a Taylor series refers to the highest exponent of the variable in the polynomial approximation. For example, a Taylor series with degree 3 would have terms up to x^3, while a Taylor series with degree 5 would have terms up to x^5. The higher the degree, the more accurate the approximation will be.

In terms of error, the degree of each term tells you how that term scales with its variable, but it does not by itself tell you which term contributes more error. In your example, you would compare the magnitudes |∂Q/∂x · Δx| and |∂Q/∂y · Δy| at the point of interest: a term of degree -2 in x blows up rapidly as x becomes small, so near x = 0 it can easily dominate the degree -0.5 term in y, while at other points the opposite can hold.

However, it's important to note that the degree is not the only factor that affects the error. The sizes of the measurement errors Δx and Δy and the smoothness of the function also play a role. So while the degree gives some information about the error, it is not the only factor to consider when deciding which variable to measure with more precision.
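To make this concrete, here is a sketch assuming hypothetical partials with exactly the degrees from the question, ∂Q/∂x ~ x⁻² and ∂Q/∂y ~ y⁻⁰·⁵ (the specific points and error sizes are invented for illustration):

```python
# Hypothetical partials matching the question: dQ/dx ~ x**-2 (degree -2)
# and dQ/dy ~ y**-0.5 (degree -0.5). Which term dominates depends on the
# point (x, y) and the error sizes, not on the degrees alone.
def err_terms(x, y, dx, dy):
    return abs(x**-2 * dx), abs(y**-0.5 * dy)

# Same measurement errors, two different points:
ex1, ey1 = err_terms(0.5, 4.0, 0.01, 0.01)    # x term dominates
ex2, ey2 = err_terms(5.0, 0.01, 0.01, 0.01)   # y term dominates
print(ex1, ey1)
print(ex2, ey2)
```

At (0.5, 4.0) the x term is larger (0.04 vs. 0.005), but at (5.0, 0.01) the y term wins (0.0004 vs. 0.1), which is why the comparison has to be done at the actual operating point.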

I hope this helps clarify the relationship between the degree of a Taylor series and the error. Let me know if you have any other questions. Good luck with your class!

Best regards,

## 1. What is a Taylor series?

A Taylor series is a mathematical tool used to represent a function as an infinite sum of polynomial terms. It is named after British mathematician Brook Taylor and is particularly useful for approximating complex functions.

## 2. What is the purpose of Taylor series?

The main purpose of Taylor series is to approximate a complex function with a simpler one. This allows for easier analysis and manipulation of the function, and can also be used to find the value of a function at a certain point.

## 3. What is the error in a Taylor series approximation?

The error in a Taylor series approximation is the difference between the value of the original function and the value of the approximated function. This error typically shrinks as more terms are included, provided the point lies within the series' region of convergence.
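A quick way to see this error shrink is to truncate a known series at increasing degrees. Here is a sketch using the Taylor series of exp(x) about 0 (the choice of exp and of x = 1 is mine, for illustration):

```python
import math

def taylor_exp(x, degree):
    """Taylor polynomial of exp(x) about 0, keeping terms up to x**degree."""
    return sum(x**k / math.factorial(k) for k in range(degree + 1))

x = 1.0
for d in (1, 3, 5):
    # Error = |exact - truncated series|; it drops as the degree grows.
    print(d, abs(math.exp(x) - taylor_exp(x, d)))
```

At x = 1 the error falls from about 0.72 at degree 1 to about 0.0016 at degree 5, illustrating how each extra term tightens the approximation.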

## 4. What determines the degree of the variables in a Taylor series?

The degree of the variables in a Taylor series is determined by the order of the derivative used in the series. For example, a second order Taylor series will have terms up to the second derivative of the function.

## 5. How can Taylor series be used to improve the accuracy of calculations?

Taylor series can be used to improve the accuracy of calculations by providing a better approximation of a function. By including more terms in the series, the approximation becomes closer to the actual value of the function, resulting in more accurate calculations.
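Beyond just adding terms, the accuracy gain can be quantified with the standard Lagrange remainder bound, |R_n(x)| ≤ M·|x|^(n+1)/(n+1)!, where M bounds the (n+1)-th derivative on the interval. A sketch for exp about 0 (the values x = 0.5 and n = 3 are assumptions chosen for illustration):

```python
import math

# Lagrange remainder bound for exp(x) about 0 on [0, x]: every
# derivative of exp is exp itself, so M = exp(x) works as a bound,
# giving |R_n(x)| <= exp(x) * x**(n+1) / (n+1)!.
x, n = 0.5, 3
partial = sum(x**k / math.factorial(k) for k in range(n + 1))
actual_err = math.exp(x) - partial
bound = math.exp(x) * x**(n + 1) / math.factorial(n + 1)
print(actual_err, bound)
```

The actual error (about 0.0029 here) sits safely below the bound (about 0.0043), so the bound can be used ahead of time to decide how many terms a given accuracy requires.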
