
Under what conditions does a function have a power series representation?

  1. Jun 5, 2008 #1



    Under what conditions does a function have a power series representation?

I am looking for a theorem that says: if a function satisfies certain conditions, then it has a power series representation. Or do all functions have a power series representation?
  3. Jun 5, 2008 #2



    Is it the case that if a function has infinitely many derivatives at a point, then it has a complete power series with respect to that point?
  4. Jun 5, 2008 #3



    Yes, that is obviously true. But then the question is what you mean by a "power series representation" because that power series may not be equal to the original function.

    A function (over the real numbers) is equal to its power series representation in some open interval about a point if and only if it is "analytic" at that point. That's really just the definition of "analytic", so it doesn't answer the original question. However, a function (on the real numbers) is analytic at x = a if and only if it can be extended to a function on the complex plane in some neighborhood of a such that it is complex-analytic, and there are a number of criteria for that.
  5. Jun 5, 2008 #4



    I think you're looking for the notion of radius of convergence:


    If a function has derivatives of all orders at a point a, then we can form the Taylor series of the function about that point, which is the best bet for matching the function with a power series. If the radius of convergence of the power series thus obtained is nonzero, then the function can be represented by a power series about that point. If the radius of convergence is infinite, then the function equals its Taylor series expansion everywhere.
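To make this concrete, here's a small Python check (illustrative only, not part of the original discussion) using e^x, whose Taylor series about 0 has infinite radius of convergence, so the partial sums approach the function at every point:

```python
import math

def taylor_exp(x, n_terms):
    """Partial sum of the Taylor series of e^x about a = 0:
    sum of x^n / n! for n = 0 .. n_terms - 1."""
    return sum(x**n / math.factorial(n) for n in range(n_terms))

# With enough terms, the partial sums agree with math.exp at any x,
# since the radius of convergence is infinite.
for x in (0.5, 2.0, -3.0):
    print(x, taylor_exp(x, 40), math.exp(x))
```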
  6. Jun 5, 2008 #5



    What if all derivatives exist at a point 'a' but the radius of convergence is 0? Do we still have a Taylor series expansion?
  7. Jun 6, 2008 #6



    Yes, we have a Taylor's series, but if it does not converge for non-zero x, in what sense does it "represent" the original function or is an "expansion" of that function?

    Actually, it is not simply a matter of "radius of convergence". It is quite possible for a function to have a Taylor's series at a given point which has non-zero radius of convergence but does NOT converge to the original function. Again, in what sense does that "represent" the original function?

    For example, take the function f(x) defined by
    [tex]f(x)= e^{-\frac{1}{x^2}}[/tex]
    if x is non-zero, while f(0) = 0. It is relatively easy to show that f is infinitely differentiable for all x, and that f and all of its derivatives are equal to 0 at x = 0. The Taylor's series for f at x = 0 is identically equal to 0, which converges for all x, while f itself is 0 only at x = 0.
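Here is a quick Python sketch (an illustration added for this thread, not from the original posts) of that counterexample: every Maclaurin coefficient of f is 0, so the Taylor series at 0 is identically zero, yet f(x) > 0 for every x ≠ 0:

```python
import math

def f(x):
    """exp(-1/x^2) extended by f(0) = 0; smooth everywhere but not
    analytic at 0, since its Taylor series at 0 is identically zero."""
    return 0.0 if x == 0 else math.exp(-1.0 / x**2)

# The values near 0 are tiny but strictly positive -- the zero Taylor
# series misses the function at every x != 0.
for x in (0.5, 0.2, 0.1):
    print(x, f(x))
```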

    (Edited so I can pretend I didn't make the mistake of saying "1/x2" rather than "-1/x2"! Thanks maze.)
    Last edited: Jun 12, 2008
  8. Jun 6, 2008 #7

    Functions similar to this are used extensively as mollifiers ("bump functions") to create a smooth function with compact support.


    When it comes to analytic functions, I always think of Douglas Adams's idea in The Hitchhiker's Guide to the Galaxy, where if you study a small piece of anything (such as a piece of cake) hard enough, you could deduce the entirety of the universe from it. All of the information in the whole analytic function is contained in the way it is curving around a single point.

    Non-analytic smooth functions are not like this - they could be zero-zero-zero-zero- and then go off and turn into a parabola, then go turn into an exponential, then go do whatever they like.
    Last edited: Jun 6, 2008
  9. Jun 6, 2008 #8
    So I'm sorry if this was answered and I didn't understand, but how do I know if a function equals its Taylor series everywhere? Is there a table somewhere? Are there rules about compositions? Can I just post my function here and you tell me if it is?
  10. Jun 6, 2008 #9
    I think compositions of functions that equal their Taylor series will still equal their Taylor series. Not 100% sure.

    Note that with the above example there is some division by 0 going on with exp(-1/x^2), so technically you have to make it piecewise to add in the point 0 at 0 so it is not a simple composition.
  11. Jun 7, 2008 #10
    Well, apparently the function under consideration should satisfy the Cauchy-Riemann equations, but how do I apply them to functions that don't have complex parts?
  12. Jun 7, 2008 #11
    That's a really cool analogy :cool:
  13. Jun 7, 2008 #12
    The definition of real analytic functions (which is that the function is its Taylor series) is not very useful, because there is no immediate way to check when a function satisfies the definition. The definition of complex analytic functions (which is that the function is continuously real differentiable, and satisfies Cauchy-Riemann equations) however is very useful, because there is a theorem that says that if [tex]f:B(z_0,r)\to\mathbb{C}[/tex] is complex analytic, where [tex]B(z_0,r)\subset\mathbb{C}[/tex] is some ball, then [tex]f[/tex] can be written as a Taylor series in this ball.

    Complex analysis then gives the obvious way to deal with Taylor series of real analytic functions too. When you are given a function [tex]f:[a,b]\to\mathbb{R}[/tex], extend it to a complex analytic function [tex]f:B((a+b)/2, (b-a)/2)\to\mathbb{C}[/tex]; then you know it has a Taylor series representation, and you can restrict the Taylor series back to the real line.

    Suppose you want to know that the Taylor series of [tex]\log:]0,2[\to\mathbb{R}[/tex] around [tex]x=1[/tex] converges towards the logarithm. We know that [tex]\log:B(1,1)\to\mathbb{C}[/tex], [tex]\log(z)=\log(|z|) + i \textrm{Arg}(z)[/tex], has the Taylor series representation, so the proof is done.
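As a numerical sanity check (Python, added for illustration; the standard expansion [tex]\log x = \sum_{n=1}^{\infty} (-1)^{n+1}(x-1)^n/n[/tex] is assumed), the partial sums do converge to the logarithm on ]0, 2[:

```python
import math

def log_series(x, n_terms):
    """Partial sum of the Taylor series of log(x) about a = 1:
    sum of (-1)^(n+1) (x-1)^n / n for n = 1 .. n_terms."""
    return sum((-1)**(n + 1) * (x - 1)**n / n for n in range(1, n_terms + 1))

# Inside the interval ]0, 2[ the series converges to math.log(x).
for x in (0.5, 1.5):
    print(x, log_series(x, 200), math.log(x))
```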

    What happens if the function cannot be extended to a complex analytic function? Then the Taylor series does not converge properly.

    For example, you cannot extend [tex]f:\mathbb{R}\to\mathbb{R}[/tex], [tex]f(x)=1/(1+x^2)[/tex] to a complex analytic function on the whole plane, because you get singularities at [tex]z=\pm i[/tex]. Not surprisingly, the Taylor series around [tex]x=0[/tex] does not converge on any open set larger than [tex]]-1,1[[/tex]. The largest ball around the origin on which f has a complex analytic continuation is [tex]B(0,1)[/tex].
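You can see this numerically (a Python illustration added here, using the geometric-series expansion [tex]1/(1+x^2) = \sum_n (-1)^n x^{2n}[/tex]): the partial sums settle down for |x| < 1 but blow up for |x| > 1, even though the real function is perfectly smooth there:

```python
def recip_series(x, n_terms):
    """Partial sum of the Taylor series of 1/(1+x^2) about 0:
    sum of (-1)^n x^(2n) for n = 0 .. n_terms - 1."""
    return sum((-1)**n * x**(2 * n) for n in range(n_terms))

# Inside |x| < 1 the partial sums converge to 1/(1+x^2) ...
print(recip_series(0.5, 50), 1 / (1 + 0.5**2))
# ... but at x = 1.5 they grow without bound: the complex poles at
# z = +/- i cap the radius of convergence at 1.
print(recip_series(1.5, 10), recip_series(1.5, 20))
```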

    Another example is the already mentioned [tex]f:\mathbb{R}\to\mathbb{R}[/tex], [tex]f(x)=e^{-1/x^2}[/tex], [tex]f(0)=0[/tex]. This function has no complex analytic extension on any ball [tex]B(0,\epsilon)[/tex]. The only attempt, [tex]f(z)=e^{-1/z^2}[/tex], [tex]f(0)=0[/tex], is not continuous at the origin, since the limit [tex]\lim_{z\to 0}f(z)[/tex] does not exist.


    I just started thinking about the possibility that there could still be a real analytic function that cannot be extended to a complex analytic one, but actually I think this is not possible. The reason is this: if

    [tex]\sum_{k=0}^{\infty} a_k (z-z_0)^k[/tex]

    converges for some [tex]z[/tex], then

    [tex]\sum_{k=0}^{\infty} a_k(\bar{z}-z_0)^k[/tex]

    converges for all [tex]\bar{z}[/tex] with [tex]|\bar{z}-z_0| < |z-z_0|[/tex]. So if there exists a real Taylor series, the Taylor series also converges at nearby complex points.

    Notice! This last conclusion is something that I realized right now while typing this message. It could be wrong. I would like to hear comments on it, even if it's right. So that I could be more sure... But if it is right, then it means that actually the question of function being real analytic can be settled completely by checking if the complex analytic extension exists!!
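The observation above can at least be checked numerically (a Python illustration added here, using the exponential series as the example): the coefficients 1/n! come from a purely real function, but the same series converges at complex arguments too, e.g. z = i recovers Euler's formula:

```python
import cmath
import math

def exp_series(z, n_terms):
    """The real Taylor series of e^x (coefficients 1/n!), evaluated
    at a complex argument z."""
    return sum(z**n / math.factorial(n) for n in range(n_terms))

# At z = i the real coefficients still give a convergent series, and it
# agrees with the complex exponential cos(1) + i*sin(1).
z = 1j
print(exp_series(z, 30), cmath.exp(z))
```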
  14. Jun 7, 2008 #13
    Oh! I only now checked more carefully the thread and noticed that HallsofIvy was saying this already here:

    In my defense, I must say that I'm not the only one who missed this, because there were still questions about how you deal with the Cauchy-Riemann conditions when you only have a real function.
  15. Jun 12, 2008 #14
    The question is: since Taylor series involve derivatives, for a non-analytic function such as the Dirac delta [tex]\delta(x)[/tex] or the Heaviside function, could we understand the derivatives in the sense of distributions? For example,

    [tex] \delta (x) = \sum_{n=0}^{\infty} (n!)^{-1} D^{n} \delta (0) x^{n} [/tex]
  16. Jun 12, 2008 #15
    The Dirac delta is not a non-analytic function; it is not a function at all, but a distribution.


    [tex]\sum_{n=0}^{\infty}\frac{(-u)^n}{n!} \delta^{(n)}_{x_0}[/tex]

    would be a distribution

    [tex]f \mapsto \sum_{n=0}^{\infty}\frac{u^n}{n!} f^{(n)}(x_0).[/tex]

    So if the test function is real analytic on the needed interval, then that is

    [tex]f\mapsto f(x_0 + u),[/tex]

    i.e.

    [tex]\sum_{n=0}^{\infty}\frac{(-u)^n}{n!} \delta^{(n)}_{x_0} = \delta_{x_0 + u}.[/tex]

    One should take a closer look at the domains of the distributions, though.
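Here is a small numerical check of that shift formula (Python, added for illustration), using f = exp as the analytic test function, since every derivative of exp at x0 is just exp(x0); the sum u^n/n! · f^(n)(x0) should then rebuild f(x0 + u):

```python
import math

def shifted_value(x0, u, n_terms):
    """Apply the distribution's action sum u^n/n! * f^(n)(x0) with
    f = exp, whose n-th derivative at x0 is exp(x0). For an analytic
    test function this should reproduce f(x0 + u)."""
    return sum(u**n / math.factorial(n) * math.exp(x0) for n in range(n_terms))

# The truncated sum agrees with exp evaluated at the shifted point.
print(shifted_value(0.5, 0.3, 30), math.exp(0.8))
```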
  17. Jun 12, 2008 #16
    You can interpret this formula as the same thing that I calculated there, so I think it's pretty much right. The minus sign in my calculation shows up only because I was looking at this a little differently, but it's not a real difference.

    Of course you have the usual problems there... like "do you know what distributions are?" and so on...