
- Thread starter tgt

- #1


I am looking for a theorem that says if a function satisfies these conditions then it has a power series representation. Or does all functions have a power series representation?


- #3

HallsofIvy

Science Advisor

Homework Helper


A function (over the real numbers) is equal to its power series representation in some open interval about a point if and only if it is "analytic" at that point. That's really just the definition of "analytic", so it doesn't answer the original question. However, a function (on the real numbers) is analytic at x = a if and only if it can be extended to a function on the complex plane in some neighborhood of a such that it is complex analytic, and there are a number of criteria for that.

- #4

quasar987

Science Advisor

Homework Helper

Gold Member


http://en.wikipedia.org/wiki/Radius_of_convergence

If a function has derivatives of all orders at a point a, then we can form the Taylor series of the function about that point, which is the best candidate for matching the function with a power series. If the radius of convergence of the power series thus obtained is nonzero, then the function can be represented by a power series about that point. If the radius of convergence is infinite, then the function equals its Taylor series expansion everywhere.
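As a small illustration of this (my own sketch, not from the thread — sympy and the degree-10 cutoff are arbitrary choices): exp(x) has infinite radius of convergence about 0, so its Taylor polynomials approach it at every real point.

```python
# Sketch: degree-10 Taylor polynomial of exp(x) about a = 0 with sympy.
# exp has infinite radius of convergence, so the partial sums approach
# the function at every real x as the degree grows.
import sympy as sp

x = sp.symbols('x')
f = sp.exp(x)

# Series up to (but not including) x^11, with the O(x^11) term dropped
taylor10 = sp.series(f, x, 0, 11).removeO()

err = float(abs((f - taylor10).subs(x, 2)))
print(err)  # already below 1e-4 at x = 2
```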

- #5


What if all derivatives exist at a point 'a' but the radius of convergence is 0? Do we still have a Taylor series expansion?

- #6

HallsofIvy


Yes, we **have** a Taylor series, but if it does not converge for nonzero x, in what sense does it "represent" the original function or count as an "expansion" of that function?

Actually, it is **not** simply a matter of "radius of convergence". It is quite possible for a function to have a Taylor series at a given point which has **non-zero** radius of convergence but does NOT converge to the original function. Again, in what sense does that "represent" the original function?

For example, take the function f(x) defined by

[tex]f(x)= e^{-\frac{1}{x^2}}[/tex]

if x is nonzero, with f(0) = 0. It is relatively easy to show that f is infinitely differentiable for all x and that f and all of its derivatives are equal to 0 at x = 0. The Taylor series for f at x = 0 is identically equal to 0, which converges for **all** x, while f itself is 0 only at x = 0.

(Edited so I can pretend I didn't make the mistake of saying "1/x^{2}" rather than "-1/x^{2}"! Thanks maze.)
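As a quick symbolic spot-check of the claim (my own sketch using sympy's one-sided limits, not a formal induction):

```python
# Sketch: verify that the first few derivatives of exp(-1/x^2) tend to 0
# as x -> 0+, matching the claim that f and all of its derivatives vanish
# at the origin (with f(0) defined separately as 0).
import sympy as sp

x = sp.symbols('x')
g = sp.exp(-1/x**2)
for n in range(1, 5):
    g = sp.diff(g, x)                       # n-th derivative for x != 0
    assert sp.limit(g, x, 0, dir='+') == 0  # vanishes as x -> 0+
print("derivatives 1..4 vanish at 0")
```

Each derivative is a rational function times exp(-1/x^2), and the exponential decay beats every pole, which is why all the limits come out 0.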


- #7


[tex]f\left(x\right)=e^{-\frac{1}{x^2}}[/tex]

Functions similar to this are used extensively as mollifiers ("bump functions") to create a smooth function with compact support.

http://en.wikipedia.org/wiki/Non-analytic_smooth_function

When it comes to analytic functions, I always think of Douglas Adams's idea in The Hitchhiker's Guide to the Galaxy, where if you study a small piece of anything (such as a piece of cake) hard enough, you can deduce the entirety of the universe from it. All of the information in the whole analytic function is contained in the way it curves around a single point.

Non-analytic smooth functions are not like this - they could be zero-zero-zero-zero- and then go off and turn into a parabola, then go turn into an exponential, then go do whatever they like.
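A minimal bump function in this family, as a sketch (the rescaling to a general interval (a, b) is my own choice, not something from the thread):

```python
import math

def bump(x, a=-1.0, b=1.0):
    """Smooth function, positive on (a, b) and identically 0 outside.

    Built from the same exp(-1/t)-style pieces as exp(-1/x^2): every
    derivative vanishes at the endpoints, so the glued function is
    C-infinity, but it is not analytic at x = a or x = b.
    """
    if x <= a or x >= b:
        return 0.0
    t = (x - a) / (b - a)              # rescale (a, b) to (0, 1)
    return math.exp(-1.0 / (t * (1.0 - t)))

print(bump(0.0) > 0, bump(2.0) == 0.0)  # True True
```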



- #9


Note that with the above example there is some division by 0 going on in exp(-1/x^2), so technically you have to define it piecewise, with value 0 at x = 0, so it is not a simple composition.


- #11


When it comes to analytic functions, I always think of Douglas Adams's idea in The Hitchhiker's Guide to the Galaxy, where if you study a small piece of anything (such as a piece of cake) hard enough, you can deduce the entirety of the universe from it. All of the information in the whole analytic function is contained in the way it curves around a single point.

That's a really cool analogy

- #12


Complex analysis then gives the obvious way to deal with Taylor series of real analytic functions too. When you are given a function [tex]f:[a,b]\to\mathbb{R}[/tex], extend it to a complex analytic function [tex]f:B((a+b)/2, (b-a)/2)\to\mathbb{C}[/tex]; you know it has a Taylor series representation, then restrict the Taylor series back to the real line.

Suppose you want to know that the Taylor series of [tex]\log:]0,2[\to\mathbb{R}[/tex] around [tex]x=1[/tex] converges to the logarithm. We know that [tex]\log:B(1,1)\to\mathbb{C}[/tex], [tex]\log(z)=\log(|z|) + i \textrm{Arg}(z)[/tex], has a Taylor series representation, so the proof is done.

What happens if the function cannot be extended to a complex analytic function? Then the Taylor series does not converge the way it should.

For example, you cannot extend [tex]f:\mathbb{R}\to\mathbb{R}[/tex], [tex]f(x)=1/(1+x^2)[/tex], to a complex analytic function on the whole plane, because you get singularities at [tex]z=\pm i[/tex]. Not surprisingly, the Taylor series around [tex]x=0[/tex] does not converge on any open set larger than [tex]]-1,1[[/tex]. The largest ball around the origin on which f has a complex analytic continuation is [tex]B(0,1)[/tex].
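Numerically this is easy to see with partial sums (my own quick sketch; the degree-50 cutoff and the sample points 0.5 and 1.5 are arbitrary):

```python
# Partial sums of the Taylor series of 1/(1+x^2) about 0:
# sum over n of (-1)^n x^(2n). The radius of convergence is 1,
# forced by the complex poles at z = +/- i.
def partial_sum(x, N):
    return sum((-1)**n * x**(2*n) for n in range(N + 1))

inside = partial_sum(0.5, 50)    # converges to 1/(1 + 0.25) = 0.8
outside = partial_sum(1.5, 50)   # terms grow like 2.25^n: divergence
print(abs(inside - 0.8) < 1e-12, abs(outside) > 1e10)  # True True
```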

Another example is the already mentioned [tex]f:\mathbb{R}\to\mathbb{R}[/tex], [tex]f(x)=e^{-1/x^2}[/tex], [tex]f(0)=0[/tex]. This function has no complex analytic extension on any ball [tex]B(0,\epsilon)[/tex]. The only candidate, [tex]f(z)=e^{-1/z^2}[/tex], [tex]f(0)=0[/tex], is not even continuous at the origin, since the limit [tex]\lim_{z\to 0}f(z)[/tex] does not exist.

...

I just started thinking about whether there could still be a real analytic function that cannot be extended to a complex analytic one, but actually I think this is not possible. The reason is this: if

[tex]\sum_{k=0}^{\infty} a_k (z-z_0)^k[/tex]

converges for some [tex]z[/tex], then

[tex]\sum_{k=0}^{\infty} a_k(\bar{z}-z_0)^k[/tex]

converges for all [tex]|\bar{z}-z_0| < |z-z_0|[/tex]. So if there exists a real Taylor series, the Taylor series also converges at nearby complex points.

Notice! This last conclusion is something that I realized just now while typing this message. It could be wrong. I would like to hear comments on it, even if it's right, so that I can be more sure... But if it is right, then it means that the question of a function being real analytic can actually be settled completely by checking whether a complex analytic extension exists!
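For what it's worth, that step is the standard Abel lemma argument; a sketch in the same notation: since the terms of a convergent series are bounded, there is an [tex]M[/tex] with [tex]|a_k||z-z_0|^k \le M[/tex] for all [tex]k[/tex]. Then for [tex]|\bar{z}-z_0| < |z-z_0|[/tex],

[tex]|a_k||\bar{z}-z_0|^k \le M\left(\frac{|\bar{z}-z_0|}{|z-z_0|}\right)^k,[/tex]

and the right-hand side is a geometric series with ratio less than 1, so the series converges absolutely at [tex]\bar{z}[/tex].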

- #13


However, a function (on the real numbers) is analytic at x = a if and only if it can be extended to a function on the complex plane in some neighborhood of a such that it is complex analytic, and there are a number of criteria for that.

In my defense I must say that I'm not the only one who missed this, because there were still questions about how you deal with the Cauchy-Riemann conditions when you only have a real function.

- #14


[tex] \delta (x) = \sum_{n=0}^{\infty} (n!)^{-1} D^{n} \delta (0) x^{n} [/tex]

- #15


the question is, since Taylor series involve derivatives, for a non-analytic function such as the Dirac delta d(x) or the Heaviside function, could we understand the derivatives in the sense of distributions??

A Dirac delta function is not a non-analytic function.

for example

[tex] \delta (x) = \sum_{n=0}^{\infty} (n!)^{-1} D^{n} \delta (0) x^{n} [/tex]

hmhmhmhmhmhm......

[tex]\sum_{n=0}^{\infty}\frac{(-u)^n}{n!} \delta^{(n)}_{x_0}[/tex]

would be a distribution

[tex]f \mapsto \sum_{n=0}^{\infty}\frac{u^n}{n!} f^{(n)}(x_0).[/tex]

So if the test function is real analytic on the needed interval, then that's

[tex]f\mapsto f(x_0 + u),[/tex]

and

[tex]\sum_{n=0}^{\infty}\frac{(-u)^n}{n!} \delta^{(n)}_{x_0} = \delta_{x_0 + u}.[/tex]

One should take a closer look at the domains of the distributions, though.
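A quick numeric sanity check of the translation identity, with sin as a stand-in analytic test function (my own sketch; the 20-term truncation and the values x0 = 1/3, u = 1/2 are arbitrary):

```python
import math

# Check: sum over n of u^n/n! * f^(n)(x0) reproduces f(x0 + u) for an
# analytic test function, so the delta series above acts as translation.
x0, u = 1.0 / 3.0, 0.5

# Derivatives of sin cycle with period 4: sin, cos, -sin, -cos
derivs = [math.sin(x0), math.cos(x0), -math.sin(x0), -math.cos(x0)]
approx = sum(u**n / math.factorial(n) * derivs[n % 4] for n in range(20))

err = abs(approx - math.sin(x0 + u))
print(err < 1e-12)  # True: the series acts as f -> f(x0 + u)
```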

- #16


[tex] \delta (x) = \sum_{n=0}^{\infty} (n!)^{-1} D^{n} \delta (0) x^{n} [/tex]

You can interpret this formula as being the same thing I calculated there, so I think it's pretty much right. The minus sign in my calculation shows up only because I was looking at this a little differently, but it's not a real difference.

Of course you have the usual problems there... like "do you know what distributions are?" and so on....
