Under what conditions does a function have a power series representation?

In summary: a function equals a power series (its Taylor series) near a point exactly when it is analytic there, and for a real function this amounts to asking whether it can be extended to a complex analytic function on some ball in the complex plane; the real Taylor series is then the restriction of the complex one. The radius of convergence plays a key role, but even when the radius of convergence is non-zero, the series may fail to converge to the original function.
  • #1
tgt

I am looking for a theorem that says that if a function satisfies certain conditions, then it has a power series representation. Or do all functions have a power series representation?
 
  • #2
Is it that if a function has infinitely many derivatives at a point, then it has a complete power series with respect to that point?
 
  • #3
Yes, that is obviously true. But then the question is what you mean by a "power series representation", because that power series may not be equal to the original function.

A function (over the real numbers) is equal to its power series representation in some open interval about a point if and only if it is "analytic" at that point. That's really just the definition of "analytic", so it doesn't answer the original question. However, a function (on the real numbers) is analytic at x = a if and only if it can be extended to a function on the complex plane in some neighborhood of a that is complex analytic, and there are a number of criteria for that.
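For reference, here is that definition in symbols (my own phrasing, not part of the original post): f is real analytic at a if there is some r > 0 and coefficients c_n with

[tex]
f(x) = \sum_{n=0}^{\infty} c_n (x-a)^n \quad \text{for all } |x-a| < r,
[/tex]

and when this holds the coefficients are forced to be [tex]c_n = f^{(n)}(a)/n![/tex], i.e. the series is the Taylor series.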
 
  • #4
I think you're looking for the notion of radius of convergence:

http://en.wikipedia.org/wiki/Radius_of_convergence

If a function has derivatives of all orders at a point a, then we can form the Taylor series of the function about that point, which is the best bet for matching the function with a power series. If the radius of convergence of the power series thus obtained is non-zero, then the function can be represented by a power series about that point. If the radius of convergence is infinite, then the function equals its Taylor series expansion everywhere.
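In symbols (my paraphrase, not quasar987's wording), the Taylor series about a and the Cauchy-Hadamard formula for its radius of convergence are

[tex]
T_f(x) = \sum_{n=0}^{\infty} \frac{f^{(n)}(a)}{n!}(x-a)^n, \qquad \frac{1}{R} = \limsup_{n\to\infty}\left|\frac{f^{(n)}(a)}{n!}\right|^{1/n}.
[/tex]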
 
  • #5
quasar987 said:
I think you're looking for the notion of radius of convergence:

http://en.wikipedia.org/wiki/Radius_of_convergence

If a function has derivatives of all orders at a point a, then we can form the Taylor series of the function about that point, which is the best bet for matching the function with a power series. If the radius of convergence of the power series thus obtained is non-zero, then the function can be represented by a power series about that point. If the radius of convergence is infinite, then the function equals its Taylor series expansion everywhere.

What if all derivatives exist at a point a but the radius of convergence is 0? Do we still have a Taylor series expansion?
 
  • #6
Yes, we have a Taylor's series, but if it does not converge for non-zero x, in what sense does it "represent" the original function, or in what sense is it an "expansion" of that function?

Actually, it is not simply a matter of "radius of convergence". It is quite possible for a function to have a Taylor's series at a given point which has non-zero radius of convergence but does NOT converge to the original function. Again, in what sense does that "represent" the original function?

For example, consider the function f(x) defined by
[tex]f(x)= e^{-\frac{1}{x^2}}[/tex]
if x is non-zero, with f(0) = 0. It is relatively easy to show that f is infinitely differentiable for all x and that f and all of its derivatives are equal to 0 at x = 0. The Taylor's series for f at x = 0 is therefore identically 0, which converges for all x, while f itself is 0 only at x = 0.
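A quick numerical illustration of this example (a sketch I'm adding in Python, not part of the original post): every Taylor polynomial of f at 0 is the zero polynomial, so the series never catches up with f away from 0.

[code]
import math

def f(x):
    # f(x) = exp(-1/x^2) for x != 0, and f(0) = 0 by definition
    return 0.0 if x == 0 else math.exp(-1.0 / x**2)

# All derivatives of f vanish at 0, so every Taylor polynomial about 0 is 0.
for x in [0.0, 0.5, 1.0, 2.0]:
    taylor_value = 0.0  # value of the N-th Taylor polynomial at x, for every N
    print(f"x = {x:3.1f}   f(x) = {f(x):.6f}   Taylor series value = {taylor_value}")
[/code]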

(Edited so I can pretend I didn't make the mistake of saying "1/x^2" rather than "-1/x^2"! Thanks maze.)
 
  • #7
[tex]f\left(x\right)=e^{-\frac{1}{x^2}}[/tex]

Functions similar to this are used extensively as mollifiers ("bump functions") to create a smooth function with compact support.

http://en.wikipedia.org/wiki/Non-analytic_smooth_function
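A standard concrete instance of such a bump function (my addition, the same kind of construction as in that article) is

[tex]
\psi(x) = \begin{cases} e^{-\frac{1}{1-x^2}}, & |x| < 1 \\ 0, & |x| \ge 1, \end{cases}
[/tex]

which is smooth everywhere, has compact support [-1, 1], and fails to be analytic at x = ±1.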

When it comes to analytic functions, I always think of Douglas Adam's idea in the Hitchhikers Guide to the Galaxy, where if you study a small piece of anything (such as a piece of cake) hard enough, you could deduce the entirety of the universe from it. All of the information in the whole analytic function is contained in the way it is curving around a single point.

Non-analytic smooth functions are not like this - they could be zero-zero-zero-zero- and then go off and turn into a parabola, then go turn into an exponential, then go do whatever they like.
 
  • #8
So I'm sorry if this was answered and I didn't understand, but how do I know if a function equals its Taylor series everywhere? Is there a table somewhere? Are there some rules about compositions? Can I just post my function here and you tell me if it is?
 
  • #9
I think compositions of functions that equal their Taylor series will still equal their Taylor series. Not 100% sure.

Note that with the above example there is some division by 0 going on with exp(-1/x^2), so technically you have to define it piecewise to add in the value 0 at x = 0, so it is not a simple composition.
 
  • #10
Well, apparently the function under consideration should satisfy the Cauchy-Riemann equations, but how do I apply them to functions that don't have complex parts?
 
  • #11
maze said:
When it comes to analytic functions, I always think of Douglas Adam's idea in the Hitchhikers Guide to the Galaxy, where if you study a small piece of anything (such as a piece of cake) hard enough, you could deduce the entirety of the universe from it. All of the information in the whole analytic function is contained in the way it is curving around a single point.

That's a really cool analogy :cool:
 
  • #12
The definition of real analytic functions (which is that the function equals its Taylor series) is not very useful, because there is no immediate way to check when a function satisfies the definition. The definition of complex analytic functions (which is that the function is continuously real differentiable and satisfies the Cauchy-Riemann equations), however, is very useful, because there is a theorem that says that if [tex]f:B(z_0,r)\to\mathbb{C}[/tex] is complex analytic, where [tex]B(z_0,r)\subset\mathbb{C}[/tex] is some ball, then [tex]f[/tex] can be written as a Taylor series in this ball.
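For completeness (my addition): writing [tex]f(x+iy) = u(x,y) + iv(x,y)[/tex], the Cauchy-Riemann equations mentioned above are

[tex]
\frac{\partial u}{\partial x} = \frac{\partial v}{\partial y}, \qquad \frac{\partial u}{\partial y} = -\frac{\partial v}{\partial x}.
[/tex]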

Complex analysis then gives the obvious way to deal with Taylor series of real analytic functions too. When you are given a function [tex]f:[a,b]\to\mathbb{R}[/tex], extend it to a complex analytic function [tex]f:B((a+b)/2, (b-a)/2)\to\mathbb{C}[/tex]; then you know it has a Taylor series representation in that ball, and you can restrict the Taylor series back to the real line.

Suppose you want to know that the Taylor series of [tex]\log:]0,2[\to\mathbb{R}[/tex] around [tex]x=1[/tex] converges to the logarithm. We know that [tex]\log:B(1,1)\to\mathbb{C}[/tex], [tex]\log(z)=\log(|z|) + i \textrm{Arg}(z)[/tex], has a Taylor series representation on that ball, so the proof is done.
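A quick numerical sanity check of that conclusion (my own sketch in Python): truncate the Taylor series of log about 1, which is [tex]\sum_{n\ge 1}(-1)^{n+1}(x-1)^n/n[/tex], and compare with the logarithm on ]0,2[.

[code]
import math

def log_taylor(x, terms=200):
    # Partial sum of the Taylor series of log(x) about x = 1,
    # valid for 0 < x < 2 (and at x = 2).
    return sum((-1) ** (n + 1) * (x - 1) ** n / n for n in range(1, terms + 1))

for x in [0.5, 1.0, 1.5, 1.9]:
    print(f"x = {x}   series = {log_taylor(x):.6f}   math.log = {math.log(x):.6f}")
[/code]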

What happens if the function cannot be extended to a complex analytic function? Then the Taylor series does not converge to the function the way one would like.

For example, you cannot extend [tex]f:\mathbb{R}\to\mathbb{R}[/tex], [tex]f(x)=1/(1+x^2)[/tex], to a complex analytic function on the whole plane, because you get singularities at [tex]z=\pm i[/tex]. Not surprisingly, the Taylor series around [tex]x=0[/tex] does not converge on any open set larger than [tex]]-1,1[[/tex]. The largest ball around the origin on which f has a complex analytic continuation is [tex]B(0,1)[/tex].
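The divergence outside ]-1,1[ is easy to see numerically (again just a sketch I'm adding): the Taylor series of 1/(1+x^2) about 0 is [tex]\sum_k (-1)^k x^{2k}[/tex], and its partial sums settle down for |x| < 1 but blow up for |x| > 1.

[code]
def partial_sum(x, terms):
    # Partial sums of sum_{k>=0} (-1)^k x^(2k), the Taylor series of 1/(1+x^2) at 0
    return sum((-1) ** k * x ** (2 * k) for k in range(terms))

for x in [0.9, 1.1]:
    true_value = 1.0 / (1.0 + x ** 2)
    for terms in [10, 50, 100]:
        print(f"x = {x}  terms = {terms:3d}  partial sum = {partial_sum(x, terms):14.4f}  "
              f"1/(1+x^2) = {true_value:.4f}")
[/code]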

Another example is the already mentioned [tex]f:\mathbb{R}\to\mathbb{R}[/tex], [tex]f(x)=e^{-1/x^2}[/tex], [tex]f(0)=0[/tex]. This function has no complex analytic extension on any ball [tex]B(0,\epsilon)[/tex]. The only candidate, [tex]f(z)=e^{-1/z^2}[/tex], [tex]f(0)=0[/tex], is not even continuous at the origin, since the limit [tex]\lim_{z\to 0}f(z)[/tex] does not exist: along the imaginary axis, [tex]z=iy[/tex], we get [tex]e^{1/y^2}\to\infty[/tex].

...

I just started thinking about the possibility that there could still be a real analytic function that cannot be extended to a complex analytic one, but actually I think this is not possible. The reason is this: If

[tex]
\sum_{k=0}^{\infty} a_k (z-z_0)^k
[/tex]

converges for some [tex]z[/tex], then

[tex]
\sum_{k=0}^{\infty} a_k(w-z_0)^k
[/tex]

converges for every complex [tex]w[/tex] with [tex]|w-z_0| < |z-z_0|[/tex]. So if there exists a real Taylor series, it also converges at nearby complex points.
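For what it's worth, the standard argument for this (my sketch, so please check it): if the series converges at z, its terms tend to 0 and so are bounded, say [tex]|a_k||z-z_0|^k \le M[/tex] for all k; then for any w with [tex]|w-z_0| < |z-z_0|[/tex],

[tex]
\sum_{k=0}^{\infty} |a_k||w-z_0|^k \le M \sum_{k=0}^{\infty}\left(\frac{|w-z_0|}{|z-z_0|}\right)^k < \infty,
[/tex]

so the series converges absolutely at w.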

Notice! This last conclusion is something that I realized right now while typing this message. It could be wrong. I would like to hear comments on it, even if it's right, so that I can be more sure... But if it is right, then it means that the question of a function being real analytic can be settled completely by checking whether a complex analytic extension exists!
 
  • #13
Oh! I only now checked more carefully the thread and noticed that HallsofIvy was saying this already here:

HallsofIvy said:
However, a function (on the real numbers) is analytic at x = a if and only if it can be extended to a function on the complex plane in some neighborhood of a that is complex analytic, and there are a number of criteria for that.

In my defense I must say that I'm not the only one who missed this, because there were still questions about how to deal with the Cauchy-Riemann conditions when you only have a real function.
 
  • #14
The question is, since Taylor series involve derivatives, for a non-analytic function such as the Dirac delta d(x) or the Heaviside function, could we understand the derivatives in the sense of distributions? For example

[tex] \delta (x) = \sum_{n=0}^{\infty} (n!)^{-1} D^{n} \delta (0) x^{n} [/tex]
 
  • #15
mhill said:
The question is, since Taylor series involve derivatives, for a non-analytic function such as the Dirac delta d(x) or the Heaviside function, could we understand the derivatives in the sense of distributions?

The Dirac delta is not a function at all, so strictly speaking it is not a "non-analytic function" either.

for example

[tex] \delta (x) = \sum_{n=0}^{\infty} (n!)^{-1} D^{n} \delta (0) x^{n} [/tex]

hmhmhmhmhmhm...

[tex]
\sum_{n=0}^{\infty}\frac{(-u)^n}{n!} \delta^{(n)}_{x_0}
[/tex]

would be a distribution

[tex]
f \mapsto \sum_{n=0}^{\infty}\frac{u^n}{n!} f^{(n)}(x_0).
[/tex]

So if the test function is real analytic on the needed interval, then that's

[tex]
f\mapsto f(x_0 + u),
[/tex]

and

[tex]
\sum_{n=0}^{\infty}\frac{(-u)^n}{n!} \delta^{(n)}_{x_0} = \delta_{x_0 + u}.
[/tex]

One should take a closer look at the domains of the distributions, though.
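A small numerical way to see that last step, assuming the test function is analytic (my own sketch in Python, with f = sin chosen purely as an example): the truncated sum [tex]\sum_n \frac{u^n}{n!}f^{(n)}(x_0)[/tex] approaches f(x_0 + u).

[code]
import math

def shifted_by_series(x0, u, terms=30):
    # Truncation of sum_{n>=0} u^n / n! * f^(n)(x0) for f = sin.
    # The derivatives of sin cycle through sin, cos, -sin, -cos.
    derivs = [math.sin(x0), math.cos(x0), -math.sin(x0), -math.cos(x0)]
    return sum(u ** n / math.factorial(n) * derivs[n % 4] for n in range(terms))

x0, u = 0.7, 1.3
print("truncated series:", shifted_by_series(x0, u))
print("sin(x0 + u)     :", math.sin(x0 + u))
[/code]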
 
  • #16
mhill said:
[tex] \delta (x) = \sum_{n=0}^{\infty} (n!)^{-1} D^{n} \delta (0) x^{n} [/tex]

You can interpret this formula as being the same thing I calculated above, so I think it's pretty much right. The minus sign in my calculation appears only because I was looking at the thing a little bit differently, but it's not a real difference.

Of course you have the usual problems there... like "do you know what distributions are?" and so on...
 

What is a power series representation?

A power series representation is a mathematical expression that represents a function as an infinite sum of terms involving powers of a variable. It is often used to approximate functions and evaluate them at different points.

What are the conditions for a function to have a power series representation?

A function must be infinitely differentiable at the point, and its Taylor series there must have a non-zero radius of convergence and converge back to the function itself; in other words, the function must be analytic at that point. Having derivatives of all orders is necessary but not sufficient, as the exp(-1/x^2) example in this thread shows.

How do you determine the radius of convergence for a power series representation?

The radius of convergence of a power series can often be determined with the ratio test: take the limit of the absolute value of the ratio of consecutive coefficients; the radius of convergence is the reciprocal of that limit, when the limit exists. Geometrically, it is the distance from the center of the series to the nearest point at which the series stops converging.
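As a small illustration of that recipe (a sketch I'm adding, using the series of log(1+x) purely as an example):

[code]
# Estimate the radius of convergence R of sum a_n x^n from the ratio test:
# 1/R = lim |a_(n+1) / a_n|, when that limit exists.
def radius_from_ratio(coeffs):
    ratios = [abs(coeffs[n + 1] / coeffs[n]) for n in range(len(coeffs) - 1) if coeffs[n] != 0]
    return 1.0 / ratios[-1] if ratios and ratios[-1] != 0 else float("inf")

# Coefficients of log(1 + x) = sum_{n>=1} (-1)^(n+1) x^n / n; the true radius is 1.
coeffs = [(-1) ** (n + 1) / n for n in range(1, 200)]
print(radius_from_ratio(coeffs))  # prints approximately 1.005
[/code]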

Can a function have more than one power series representation?

Only one about any fixed center: the coefficients are uniquely determined by the derivatives of the function at that center. About different centers, however, the same function generally has different power series expansions.

What are the practical applications of power series representations?

Power series representations are used in many areas of science and engineering, including physics, chemistry, and computer science. They are particularly useful for approximating and evaluating functions that are difficult to compute using other methods. They are also used in calculus to find derivatives and integrals of functions.
