Taylor Series for any x = Function(x) for any x?

morrobay
When a Taylor series is generated from a function's derivatives at a single point, is that series equal to the original function at every value of x? For example, graph the original function f(x) from x = 0 to x = 10. Now plug values from 0 to 10 into the Taylor expansion and graph those. Are the two plots approximately equal, or exactly equal?
Numerical example, not to be worked but just to illustrate the question:
Suppose ##f(x) = 4x^3 + 8x^2 - 3x + 2##
 
Hi morrobay! :smile:

In your example, you just have a polynomial, which means that the Taylor expansion will equal the function exactly once the expansion is taken far enough (here, the four terms up through x^3).

In general, a polynomial will always equal the Taylor expansion if you take the expansion far enough.
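If you want to check this concretely, here is a minimal sympy sketch (just an illustration of the idea, using the polynomial from the question):

```python
import sympy as sp

x = sp.symbols('x')
f = 4*x**3 + 8*x**2 - 3*x + 2

# Degree-3 Taylor polynomial about x = 0, built from the derivatives at 0:
# T(x) = sum of f^(k)(0)/k! * x^k for k = 0..3
taylor = sum(f.diff(x, k).subs(x, 0) / sp.factorial(k) * x**k
             for k in range(4))

print(sp.expand(taylor))        # 4*x**3 + 8*x**2 - 3*x + 2
print(sp.simplify(taylor - f))  # 0, so the two graphs on [0, 10] coincide
```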

A more interesting case is something like sin(x). No finite Taylor expansion of sine will ever exactly equal the sine, but the expansions converge to it. That is, the more terms of the expansion you take, the better the expansion approximates the sine.
For example, the first three nonzero terms already give quite a good approximation:

$$\sin(x)\approx x-\frac{x^3}{3!}+\frac{x^5}{5!}$$

When you take the entire Taylor series (that is: when you take all the terms), then you get equality:

$$\sin(x)=\sum_{n=0}^{\infty}\frac{(-1)^n x^{2n+1}}{(2n+1)!}$$
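You can watch the convergence numerically with a short Python sketch (the test point x = 2 is arbitrary):

```python
import math

def sin_taylor(x, terms):
    """Partial sum of the Taylor series of sin about 0."""
    return sum((-1)**n * x**(2*n + 1) / math.factorial(2*n + 1)
               for n in range(terms))

x = 2.0
for terms in (1, 2, 3, 5, 8):
    print(terms, sin_taylor(x, terms))  # approaches math.sin(2.0) = 0.909...
```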

However, there are certain functions whose Taylor expansions do not approximate the function well everywhere. Take the Taylor expansion of log(x) at 1: its radius of convergence is 1, so for points larger than 2 the series diverges and the partial sums form very, very bad approximations of the function.
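Here is a sketch of that behaviour (the test points 1.5 and 3.0 are just examples, one inside the interval of convergence and one outside):

```python
import math

def log_taylor(x, terms):
    """Partial sum of the Taylor series of log(x) about x = 1."""
    return sum((-1)**(n + 1) * (x - 1)**n / n for n in range(1, terms + 1))

for x in (1.5, 3.0):
    # At x = 1.5 the sums settle near log(1.5); at x = 3.0 they blow up.
    print(x, [round(log_taylor(x, t), 3) for t in (5, 10, 20)],
          "actual:", round(math.log(x), 3))
```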
 
Even more interesting is the Taylor series for ##f(x) = e^{-1/x^2}## if x is not 0, and ##f(0) = 0##. That function is infinitely differentiable at x = 0, and its repeated derivatives are rational functions times ##e^{-1/x^2}## for x not 0, and 0 at x = 0. That is, the Taylor series for this function exists and is identically 0 for all x. Clearly, f(x) is not 0 except at x = 0, so this is a function whose Taylor series exists for all x but is not equal to the function value except at x = 0.
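A quick sympy check of the first few derivatives at 0 (a sketch only; the one-sided limit from the right stands in for the full derivative, which vanishes by the same argument):

```python
import sympy as sp

x = sp.symbols('x')
f = sp.exp(-1/x**2)

# Each derivative is a rational function times exp(-1/x^2), which -> 0
# as x -> 0, so every Taylor coefficient of f at 0 vanishes.
for k in range(4):
    print(k, sp.limit(f.diff(x, k), x, 0, '+'))  # prints 0 each time
```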

So, no: for general functions, the Taylor series is not necessarily equal to the function value. The "analytic functions" are specifically defined to be those for which it is: a function is analytic at x = a if and only if its Taylor series at x = a is equal to the function value for all x in some neighborhood of a.
 