Why the Taylor Series has a Factorial Factor

In summary, the factorial terms appear naturally in the derivation of the Taylor series: after differentiating the series n times, the constant term is the nth derivative at 0, and the n! comes from the nth derivative of ##x^n##, which produces that constant term.
  • #1
matematikuvol
Why does the Taylor series have a factorial ##!## factor?
[tex]f(x)=f(0)+xf'(0)+\frac{x^2}{2!}f''(0)+...[/tex]
Why do we have that ##\frac{1}{n!}## factor?
 
  • #2
matematikuvol said:
Why does the Taylor series have a factorial ##!## factor?
[tex]f(x)=f(0)+xf'(0)+\frac{x^2}{2!}f''(0)+...[/tex]
Why do we have that ##\frac{1}{n!}## factor?

Just take the nth derivative of both sides of the Taylor series and evaluate at 0. The constant term that survives is the nth derivative at 0; the n! comes from the nth derivative of ##x^n##, which is exactly n!, so the ##\frac{1}{n!}## is needed to cancel it.
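Written out, here is a minimal check of that cancellation (assuming the series may be differentiated term by term): differentiating the general term n times gives

[tex]\frac{d^n}{dx^n}\left(\frac{x^k}{k!}f^{(k)}(0)\right)=\begin{cases}0 & k<n\\ f^{(n)}(0) & k=n\\ \frac{x^{k-n}}{(k-n)!}f^{(k)}(0) & k>n\end{cases}[/tex]

so at x = 0 every term with k > n vanishes as well, and the nth derivative of the series collapses to ##f^{(n)}(0)##, exactly matching the function's derivative there.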
 
  • #3
matematikuvol said:
Why does the Taylor series have a factorial ##!## factor?
[tex]f(x)=f(0)+xf'(0)+\frac{x^2}{2!}f''(0)+...[/tex]
Why do we have that ##\frac{1}{n!}## factor?

The factorial terms appear in the derivation of the Taylor series.

For instance, a derivation of the Taylor series around zero starts with the equation,

[tex]f(x) - f(0) = \int^{1}_{0}f'((1-t)x)\,x\,dt[/tex]

By integration by parts this integral also equals,

[tex]f'((1-t)x)\,x\,t\,\Big|^{1}_{0} + \int^{1}_{0}t\,f''((1-t)x)\,x^{2}\,dt[/tex]

where the first term equals ##f'(0)x##.

Integrating by parts again, the integral term becomes,

[tex]f''((1-t)x)\,x^{2}\,\frac{t^2}{2!}\,\Big|^{1}_{0} + \int^{1}_{0}\frac{t^2}{2!}f'''((1-t)x)\,x^{3}\,dt[/tex]

Simplifying again, the first term becomes

[tex]\frac{x^2}{2!}f''(0)[/tex]

So far one has [tex]f(x) - f(0) = xf'(0) + \frac{x^2}{2!}f''(0) + \int^{1}_{0}\frac{t^2}{2!}f'''((1-t)x)\,x^{3}\,dt[/tex]

Continuing to integrate by parts gives you the first n terms of the Taylor series plus a remainder integral. The series converges to the function if this remainder shrinks to zero as n goes to infinity. The factorials fall out inductively and arise because the antiderivative of

[tex]\frac{t^n}{n!}[/tex] is [tex]\frac{t^{n+1}}{(n+1)!}[/tex]
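
Carrying the induction through n steps gives (a compact statement of the pattern above, assuming f is n+1 times continuously differentiable)

[tex]f(x) = \sum^{n}_{k=0}\frac{x^k}{k!}f^{(k)}(0) + \int^{1}_{0}\frac{t^n}{n!}f^{(n+1)}((1-t)x)\,x^{n+1}\,dt[/tex]

which is the Taylor polynomial of degree n plus the integral form of the remainder; the series converges to f exactly when that remainder tends to zero as n grows.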
 
  • #4
The problem is to reconstruct a function, given its value and derivatives at a single point, under nice conditions.

To give a baby example, suppose you know the velocity of an object is bounded by M. Then in t seconds, you know that it couldn't get farther than tM from the starting point.

To give another example, suppose the acceleration is bounded by M and the initial velocity is zero. The velocity can't be more than tM in absolute value at time t, so the position can't be more than ##\frac{1}{2}t^2 M## away from the starting point.

If M bounds the absolute value of the nth derivative, and all the previous derivatives and the function value vanish, you can play the same game and you get that the function can grow no faster than ##\frac{M}{n!}t^n##. So your bound M doesn't have to be that good, because it gets divided by n!.
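
To sketch how that bound falls out (assuming the function value and the first n-1 derivatives vanish at 0, ##|f^{(n)}|\le M##, and ##t\ge 0##), integrating the bound repeatedly from 0 gives

[tex]|f^{(n-1)}(t)|\le Mt,\qquad |f^{(n-2)}(t)|\le \frac{M t^2}{2!},\qquad \dots,\qquad |f(t)|\le \frac{M t^n}{n!}[/tex]

with each integration contributing one more power of t and one more factor in the growing factorial.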

In the general case, you can subtract off a polynomial to reduce to the case above. What polynomial do you subtract from the function so that all the derivatives up to the nth one are zero at the given point? The answer is the Taylor polynomial: a polynomial whose derivatives at the given point are the same as the function's. From there, you can refer to mathman's answer.

Under favorable conditions, for example, if ALL the derivatives are uniformly bounded by the SAME number, M, the n! will make the estimates get better and better as you pass to higher order polynomials and the Taylor series will converge to the function.
 
  • #5
because

[tex]\left(\dfrac{d}{dx}\right)^n \frac{x^n}{n!}=1[/tex]
 
  • #6
matematikuvol said:
Why does the Taylor series have a factorial ##!## factor?
[tex]f(x)=f(0)+xf'(0)+\frac{x^2}{2!}f''(0)+...[/tex]

Not necessarily true; there are plenty of functions for which the Taylor series doesn't equal the function, even nice, non-piecewise ones like [itex]e^{-\frac1{x^2}}[/itex] (a famous example; its Taylor series around x=0 is just 0).

Anyway, here's the way I've always thought about it. Start differentiating multiple times at x=0. You'll notice that both f and its Taylor series have the same nth derivative at 0. Same derivatives "implies" (for sufficiently well-behaved functions) the same function.

Again, that was sort of ... not right, but it should give you some idea of why that factor exists.
 
  • #7
^Sometimes it is better to use a symbol other than =
[tex]\mathrm{f}(x) \sim \sum^\infty_{k=0} \frac{x^k}{k!} \left.\dfrac{d^k \mathrm{f}}{dx^k}\right|_{x=0}[/tex]

to emphasize the point that an equivalence can be defined by the relation "has the same Taylor series as".
 
  • #8
lurflurf's point is a good one. Even if a function is infinitely differentiable, so that we can form its Taylor series, and even if that series converges for all x, the function is not necessarily equal to it, except, of course, at the "initial point". The classic example is [itex]f(x)= e^{-1/x^2}[/itex] if x is not 0, f(0)= 0. It can be shown that this function is infinitely differentiable for all x, particularly at x= 0, and that all derivatives have the value 0 at x= 0. That is, its Taylor series about x= 0 (MacLaurin series) is identically equal to 0 for all x. But clearly f(x) is not identically 0. In fact, it is 0 only at x= 0, so that is the only point where the function is equal to its Taylor series.
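To see why every derivative of that example vanishes at 0, here is a sketch of the standard computation for the first derivative (using the limit definition and the substitution u = 1/h):

[tex]f'(0)=\lim_{h\to 0}\frac{e^{-1/h^2}-0}{h}=\lim_{u\to\pm\infty}u\,e^{-u^2}=0[/tex]

and the higher derivatives behave the same way, since away from 0 each one is a polynomial in 1/x times ##e^{-1/x^2}##, which tends to 0 as x approaches 0.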
 

1. Why is the factorial factor necessary in the Taylor series?

The factorial factor is necessary because it normalizes each term of the series. Differentiating ##x^n## n times produces ##n!##, so dividing the nth term by ##n!## ensures that the nth derivative of the series at the expansion point is exactly ##f^{(n)}(0)##, the nth derivative of the function. Without it, the higher-order terms would not reproduce the function's higher-order derivatives, and the approximation would fail beyond the first-order term.

2. How does the factorial factor affect the convergence of the Taylor series?

The factorial factor plays a crucial role in the convergence of the Taylor series. As the degree of the polynomial increases, ##n!## grows faster than any exponential, so the terms ##\frac{x^n}{n!}f^{(n)}(0)## shrink in magnitude as long as the derivatives ##f^{(n)}(0)## do not grow too quickly. This is what allows the partial sums to approximate the function more and more accurately and the series to converge.
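
As a concrete illustration, take the standard exponential series, whose derivatives at 0 are all equal to 1:

[tex]e^x=\sum^{\infty}_{n=0}\frac{x^n}{n!},\qquad \left|\frac{x^{n+1}/(n+1)!}{x^{n}/n!}\right|=\frac{|x|}{n+1}\longrightarrow 0[/tex]

so by the ratio test the series converges for every x, precisely because of the ##n!## in the denominators.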

3. Can the factorial factor be omitted in the Taylor series?

No, the factorial factor cannot be omitted from the Taylor series. As explained above, it is exactly what cancels the ##n!## produced when ##x^n## is differentiated n times; without it, the coefficients would no longer match the derivatives of the function, and the series would not accurately represent the function at the given point.

4. How does the factorial factor relate to the concept of derivatives?

The factorial factor is closely related to the concept of derivatives: the term involving the nth derivative is divided by ##n!##, where n is the number of times the function has been differentiated. The first-derivative term is divided by 1! = 1, the second-derivative term by 2! = 2, the third-derivative term by 3! = 6, and so on.
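
For a concrete instance, the Maclaurin expansion of the sine function reads

[tex]\sin x = x - \frac{x^3}{3!} + \frac{x^5}{5!} - \cdots = x - \frac{x^3}{6} + \frac{x^5}{120} - \cdots[/tex]

where the term coming from the third derivative is divided by 3! = 6 and the term from the fifth derivative by 5! = 120.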

5. Can the factorial factor have a negative value in the Taylor series?

No, the factorial factor cannot have a negative value in the Taylor series, because factorials of negative numbers are not defined. The factorial here is taken of a non-negative integer, the number of times the function has been differentiated.
