Does every continuous function have a power series expansion on a closed interval?


by kof9595995
Tags: continuous, expansion, function, interval, power, series
kof9595995
#1
Oct31-09, 03:37 AM
By the Weierstrass approximation theorem, it seems obvious that every continuous function has a power series expansion on a closed interval, but I'm not 100% sure about this. Is this genuinely true, or are there counterexamples?
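For reference, the version of the theorem I have in mind: if f is continuous on a closed interval [a,b], then for every ε > 0 there is a polynomial p with
[tex]\max_{a \le x \le b} |f(x) - p(x)| < \varepsilon.[/tex]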
g_edgar
#2
Oct31-09, 07:36 AM
This is incorrect. The sum of a power series is an analytic function. So, for example, the function |x| on the interval [-1,1] is not the sum of a power series, since it is not even differentiable at x=0.
kof9595995
#3
Oct31-09, 10:01 AM
But by Weierstrass you can always construct a sequence of polynomials uniformly converging to it, can't you? I thought differentiability is required only if you want to do something like a Taylor series expansion.

kof9595995
#4
Oct31-09, 11:21 AM
Em, now I see what you mean. As long as a function can be represented by a power series, it can be differentiated as many times as you want, so |x| cannot be represented by a power series.
But by the Weierstrass theorem you can always construct a sequence of polynomials uniformly converging to it, so in what sense is this different from saying that it can be written as a power series? I'm very confused.
LCKurtz
#5
Oct31-09, 07:43 PM
Quote by kof9595995:
Em, now I see what you mean. As long as a function can be represented by a power series, it can be differentiated as many times as you want, so |x| cannot be represented by a power series.
But by the Weierstrass theorem you can always construct a sequence of polynomials uniformly converging to it, so in what sense is this different from saying that it can be written as a power series? I'm very confused.
A sequence of polynomials converging to a continuous function may be something other than a sequence of partial sums of a Taylor series. A sequence of Bernstein polynomials converging to f is a good example.
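For reference, on [0,1] the Bernstein polynomials of f are
[tex]B_n(f)(x) = \sum_{k=0}^{n} \binom{n}{k} f\!\left(\frac{k}{n}\right) x^k (1-x)^{n-k},[/tex]
and they converge to f uniformly on [0,1] for every continuous f, with no differentiability assumed; yet for each n they are just polynomials, not partial sums of a single power series.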

Also there exist infinitely smooth functions that aren't represented by their Taylor series. Consider
[tex]f(x) = e^{- \frac 1 {x^2}},\ x \neq 0[/tex]
and f(0) = 0. This has derivatives of all orders equal to 0 at x = 0, so its Taylor series at 0 sums identically to 0, which agrees with f(x) only at x = 0. Yet on any finite interval f can be uniformly approximated by polynomials.
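For what it's worth, here is a small symbolic sketch of this (assuming sympy is available): the first few derivatives of f all tend to 0 at x = 0, while f itself is nonzero away from 0.
[code]
# Sketch, assuming sympy: every derivative of f(x) = exp(-1/x^2) (with f(0) = 0)
# tends to 0 as x -> 0, so the Taylor series of f at 0 is identically zero.
import sympy as sp

x = sp.symbols('x')
f = sp.exp(-1 / x**2)

for n in range(1, 6):
    dn = sp.diff(f, x, n)          # n-th derivative for x != 0
    print(n, sp.limit(dn, x, 0))   # each limit comes out 0
print(f.subs(x, sp.Rational(1, 2)))  # exp(-4), so f is not the zero function
[/code]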
kof9595995
#6
Nov1-09, 07:36 AM
Quote by LCKurtz:
A sequence of polynomials converging to a continuous function may be something other than a sequence of partial sums of a Taylor series. A sequence of Bernstein polynomials converging to f is a good example.
Right, I know it needn't be a Taylor series, but why can there be no power series at all in some cases, like |x| around x = 0?
LCKurtz
#7
Nov1-09, 06:57 PM
Quote by kof9595995:
Right, I know it needn't be a Taylor series, but why can there be no power series at all in some cases, like |x| around x = 0?
Didn't you read G_Edgar's explanation? Another way you might see it is that if f(x) is expressible as a series like:

[tex]f(x) = \sum_{n=0}^\infty a_nx^n[/tex]

with a positive radius of convergence, then f must be as differentiable as the right side and, moreover, differentiating both sides shows that the only choice for the coefficients is the Taylor coefficients. So the Taylor series is the only game in town for a power series.
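Spelled out: inside the radius of convergence the series can be differentiated term by term, and differentiating k times and then setting x = 0 kills every term except the k-th, so

[tex]f^{(k)}(0) = k!\, a_k \quad\Longrightarrow\quad a_k = \frac{f^{(k)}(0)}{k!}.[/tex]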
kof9595995
#8
Nov1-09, 10:45 PM
Quote by LCKurtz:
Didn't you read G_Edgar's explanation? Another way you might see it is that if f(x) is expressible as a series like:

[tex]f(x) = \sum_{n=0}^\infty a_nx^n[/tex]

with a positive radius of convergence, then f must be as differentiable as the right side and, moreover, differentiating both sides shows that the only choice for the coefficients is the Taylor coefficients. So the Taylor series is the only game in town for a power series.
Well, I thought any polynomial can be written as a power series; like with the Bernstein polynomials, I thought if you expand each term and collect the terms with the same power, then you get a power series. Is that wrong because we can't switch the order of terms in an infinite series without proving absolute convergence, or is there some other reason?
And why, in the Bernstein case, don't we need the differentiability condition?
Please bear with me if the question is too naive; I'm not majoring in math, so I'm really not good at this analysis stuff.
Thanks.
Office_Shredder
#9
Nov1-09, 11:53 PM
The question is: given a sequence of Bernstein polynomials, do they actually converge to a power series? In general, the coefficients will change from one polynomial to the next.
kof9595995
#10
Nov2-09, 09:05 AM
Quote by Office_Shredder:
The question is: given a sequence of Bernstein polynomials, do they actually converge to a power series? In general, the coefficients will change from one polynomial to the next.
So you mean that if I try to rewrite the Bernstein polynomials in terms of powers of x, then as the Bernstein polynomials approach the function, the coefficients of x^n won't converge, but will probably oscillate or blow up instead, right?
But for those functions that can be written as a power series, the coefficients must converge to what we collect from the Bernstein polynomials, is that right?
LCKurtz
#11
Nov2-09, 10:17 AM
Quote by kof9595995:
So you mean that if I try to rewrite the Bernstein polynomials in terms of powers of x, then as the Bernstein polynomials approach the function, the coefficients of x^n won't converge, but will probably oscillate or blow up instead, right?
But for those functions that can be written as a power series, the coefficients must converge to what we collect from the Bernstein polynomials, is that right?
I have never looked at those two specific questions, so I don't know for sure. I would expect that the answer to your first question is yes and the second is no.
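For what it's worth, here is a quick numerical sketch of that expectation (assuming numpy is available): expand the Bernstein polynomials of |x| on [-1,1] into powers of x and watch the coefficient of x^2 as the degree n grows. It keeps growing instead of settling down, which is consistent with the answer to the first question being yes.
[code]
# Sketch, assuming numpy: monomial coefficients of the Bernstein polynomials
# of f(x) = |x| on [-1, 1].  The x^2 coefficient keeps growing with n instead
# of converging.
import numpy as np
from math import comb

def bernstein_monomial_coeffs(f, n):
    """Coefficients (constant term first) of the degree-n Bernstein
    polynomial of f on [-1, 1], written in powers of x."""
    up = np.polynomial.Polynomial([0.5, 0.5])     # (1 + x) / 2
    down = np.polynomial.Polynomial([0.5, -0.5])  # (1 - x) / 2
    total = np.polynomial.Polynomial([0.0])
    for k in range(n + 1):
        node = -1 + 2 * k / n                     # sample point in [-1, 1]
        total = total + f(node) * comb(n, k) * up**k * down**(n - k)
    return total.coef

for n in (4, 16, 64, 256):
    coeffs = bernstein_monomial_coeffs(abs, n)
    print(n, coeffs[2])   # grows with n rather than converging
[/code]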

