Taylor Series: Equivalence of Two Forms Explained

I don't get how these two forms of the Taylor series are equivalent:

f(x+h)= \sum_{k=0}^{\infty} \frac{f^{(k)}(x)}{k!} h^k

f(x) = \sum_{k=0}^{\infty} \frac{f^{(k)}(0)}{k!}x^k

The second one makes sense, but I just can't derive the first form using the second. I know it's something very simple, but I keep confusing myself!
 
Replace x by x' in the second equation.
Then the second equation is obtained from the first by setting

x = 0
x' = h
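
Written out explicitly (a formal manipulation, assuming both series converge): the second form with x renamed x' reads

f(x') = \sum_{k=0}^{\infty} \frac{f^{(k)}(0)}{k!}\,x'^k,

and putting x = 0 and x' = h into the first form gives

f(0+h) = \sum_{k=0}^{\infty} \frac{f^{(k)}(0)}{k!}\,h^k,

which is the same series, so the second equation is the x = 0 special case of the first.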
 
Or, let g(h)=x+h. Then f(x+h)=(f\circ g)(h).

The first equation is the Taylor (Maclaurin) series of (f\circ g)(h):

(f\circ g)^{(k)}(0) = (f^{(k)}\circ g)(0)\cdot 1=f^{(k)}(x+0)=f^{(k)}(x)

\Rightarrow f(x+h)=\sum_{k=0}^{\infty} \frac{f^{(k)}(x)}{k!}h^k
 
quasar987 said:
(f\circ g)^{(k)}(0) = (f^{(k)}\circ g)(0)\cdot 1=f^{(k)}(x+0)=f^{(k)}(x)
Your approach makes sense, but could you explain the above line in a little more detail? Specifically, I don't get how you got f^{(k)}(x+0) in the above equation.
 
(f^{(k)}\circ g)(0)\cdot 1=f^{(k)}(g(0))=f^{(k)}(x+0)=f^{(k)}(x).

Sorry for stepping in, I was bored.
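
To spell out the chain-rule step behind this (a routine computation, assuming f is sufficiently smooth): since g(h) = x + h has g'(h) = 1 and all higher derivatives of g vanish,

(f\circ g)'(h) = f'(g(h))\,g'(h) = f'(x+h)\cdot 1,

(f\circ g)''(h) = f''(x+h)\cdot 1,

and in general (f\circ g)^{(k)}(h) = f^{(k)}(x+h). Evaluating at h = 0 gives (f\circ g)^{(k)}(0) = f^{(k)}(x).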
 
Thanks!...
 
quasar987 said:
Or, let g(h)=x+h. Then f(x+h)=(f\circ g)(h).

The first equation is the Taylor (Maclaurin) series of (f\circ g)(h):

(f\circ g)^{(k)}(0) = (f^{(k)}\circ g)(0)\cdot 1=f^{(k)}(x+0)=f^{(k)}(x)

\Rightarrow f(x+h)=\sum_{k=0}^{\infty} \frac{f^{(k)}(x)}{k!}h^k

Actually, I am still kinda confused. I know now how you got f^{(k)}(x) but how did you get h^k?
 
Because we're computing the Taylor series of a function of h. Recall, I set g(h)=x+h, a function of h; x is considered constant. And this h dependence is passed on to f(x+h): f(x+h)=(f\circ g)(h).
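
In other words, if we write F(h) = f(x+h) = (f\circ g)(h) (F is just a convenient label here, not used elsewhere in the thread), the Maclaurin form applied to F gives

F(h) = \sum_{k=0}^{\infty}\frac{F^{(k)}(0)}{k!}\,h^k = \sum_{k=0}^{\infty}\frac{f^{(k)}(x)}{k!}\,h^k.

The h^k appears because h is the expansion variable; x enters only through the coefficients F^{(k)}(0) = f^{(k)}(x).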
 
Taylor series are never written consistently, with some authors choosing to expand the series in x about zero, and others choosing to expand the series in h (or a) about x. Still others choose to evaluate the function at x, with the series expanded around a point a, and the powers being of (x-a). So you can see lots of things like:

f(x) = \sum_{k=0}^{\infty} \frac{f^{(k)}(0)}{k!}x^k

f(x+h)= \sum_{k=0}^{\infty} \frac{f^{(k)}(x)}{k!} h^k

f(x)= \sum_{k=0}^{\infty} \frac{f^{(k)}(a)}{k!} (x-a)^k

Now personally, I prefer to expand in x about the point a, as there is no guarantee that you will be able to expand about zero: the function may not even be defined there, or may be singular. Secondly, it's nice to be able to just write f(x) and not have to worry too much about the "arbitrary but fixed" point a (I detest this phrase). You can keep the regular f(x) notation and then just change a at will.
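
For a concrete case where expanding about zero is impossible but expanding about another point works, take f(x) = \ln x, which is not even defined at x = 0. Expanding about a = 1 gives the standard series

\ln x = \sum_{k=1}^{\infty} \frac{(-1)^{k+1}}{k}\,(x-1)^k,

valid for 0 < x \le 2.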

By the way, anyone interested in Taylor series in higher dimensions should look into the rather nice multi-index notation for multi-variable analysis. It enables you to write things like:

f(\mathbf{x})=\sum_{|\alpha| \ge 0} \frac{D^{\alpha}f(\mathbf{a})}{\alpha!} (\mathbf{x}-\mathbf{a})^{\alpha}

Here \mathbf{x} is a vector of n variables (x_1, x_2, \dots, x_n), and \alpha is... complicated. It's called multi-index notation and is very useful for compacting the oftentimes awkward Taylor expansions in n dimensions. It takes a bit of getting used to but is worth it.
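
For reference, the standard multi-index conventions are (these are the usual definitions, not specific to this thread): for \alpha = (\alpha_1, \dots, \alpha_n) with non-negative integer entries,

|\alpha| = \alpha_1 + \cdots + \alpha_n, \qquad \alpha! = \alpha_1!\,\alpha_2!\cdots\alpha_n!, \qquad \mathbf{x}^{\alpha} = x_1^{\alpha_1} x_2^{\alpha_2}\cdots x_n^{\alpha_n},

D^{\alpha} = \frac{\partial^{|\alpha|}}{\partial x_1^{\alpha_1}\,\partial x_2^{\alpha_2}\cdots\,\partial x_n^{\alpha_n}}.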

My personal favourite is its compression of the inherently forgettable, but undeniably useful, multinomial expansion:
(x_1 + x_2 + \cdots + x_n )^k = k! \sum_{|\alpha|=k} \frac{\mathbf{x}^{\alpha}}{\alpha!}

This is a good deal more memorable than the usual expansion.
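
As a quick check, take n = 2 and k = 2. The multi-indices with |\alpha| = 2 are (2,0), (1,1) and (0,2), so the formula gives

(x_1 + x_2)^2 = 2!\left(\frac{x_1^2}{2!\,0!} + \frac{x_1 x_2}{1!\,1!} + \frac{x_2^2}{0!\,2!}\right) = x_1^2 + 2x_1 x_2 + x_2^2,

as expected.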
 
For Taylor series in multiple dimensions, the easiest form to use is the operator

e^{\mathbf{r}\cdot\nabla}

which gives

e^{\mathbf{r}\cdot\nabla} f(\mathbf{0}) = f(\mathbf{r})
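
Expanding the exponential formally (ignoring convergence questions) shows this is just the multivariable Taylor series collected into operator form:

e^{\mathbf{r}\cdot\nabla} f\Big|_{\mathbf{0}} = \sum_{k=0}^{\infty}\frac{(\mathbf{r}\cdot\nabla)^k}{k!} f\Big|_{\mathbf{0}} = f(\mathbf{0}) + \mathbf{r}\cdot\nabla f(\mathbf{0}) + \tfrac{1}{2}(\mathbf{r}\cdot\nabla)^2 f(\mathbf{0}) + \cdots = f(\mathbf{r}).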
 