Interval of convergence for the Taylor series expansion of 1/x^2

In summary: substituting x^2 for x in the Taylor series of 1/x centered at 1 gives a valid expansion of 1/x^2, but it is no longer a Taylor series, and it converges only for x in (-√2, 0) ∪ (0, √2); the Taylor series of 1/x^2 centered at 1 is obtained by differentiating the series for 1/x, and it converges on (0, 2).
  • #1
nomadreid
Gold Member

Homework Statement


Find the interval of convergence of the Taylor series expansion of 1/x^2, given that the interval of convergence of the Taylor series of 1/x centered at 1 is (0,2).

Homework Equations


If I is the interval of convergence of the expansion of f(x) , and one substitutes a finite polynomial g(x) in for x to get the expansion of f(g(x)) in terms of x, then the interval of convergence changes from I to {x:g(x) in I}

The Attempt at a Solution


So it would seem that going from 1/x, with interval of convergence (0,2), to 1/x^2, one would get the new interval of convergence (-√2, 0) ∪ (0, √2). But this is not a single interval, and it would seem to have two centers.
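Spelling that set out explicitly (just the elementary inequality behind the guess):
$$\{x : x^2 \in (0,2)\} = \{x : 0 < x^2 < 2\} = (-\sqrt 2,\, 0) \cup (0,\, \sqrt 2),$$
with ##x = 0## excluded because ##0^2 = 0## does not lie in ##(0,2)##.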
 
  • #2
Moved from Precalc section -- questions about Taylor series pertain to calculus.
nomadreid said:

Homework Statement


Find the interval of convergence of the Taylor series expansion of 1/x^2, given that the interval of convergence of the Taylor series of 1/x centered at 1 is (0,2).

Homework Equations


If I is the interval of convergence of the expansion of f(x) , and one substitutes a finite polynomial g(x) in for x to get the expansion of f(g(x)) in terms of x, then the interval of convergence changes from I to {x:g(x) in I}

The Attempt at a Solution


So it would seem that going from 1/x, with interval of convergence (0,2), to 1/x^2, one would get the new interval of convergence (-√2, 0) ∪ (0, √2). But this is not a single interval, and it would seem to have two centers.
The Taylor series for 1/x here is centered at 1, which means that the Taylor series will be in powers of (x - 1). The same will be true of the series for 1/x^2. IOW, 1 will be in the middle of the interval. That is not the case with your assumed intervals.
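In symbols, a Taylor series centered at 1 has the form
$$\sum_{n=0}^{\infty} c_n (x-1)^n,$$
and, apart from possibly the endpoints, it converges on an interval ##(1-R,\, 1+R)## for some radius ##R## (possibly infinite), so 1 always sits at the midpoint.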
 
  • #3
Ah, thanks, Mark44. On one hand that makes sense, but on the other hand I am worried about what this implies for my principle
"If I is the interval of convergence of the expansion of f(x) , and one substitutes a finite polynomial g(x) in for x to get the expansion of f(g(x)) in terms of x, then the interval of convergence changes from I to {x:g(x) in I}"
concerning the procedure of obtaining a Taylor series expansion for f(g(x)) by substituting y = g(x) into the expansion for f(y). On the face of it, the principle appears reasonable, but from your answer it is apparently not valid. If we allow a value k for x in the new interval of convergence which would make g(k) lie outside of the original one, then substituting g(k) into the original series would lead to divergence, no? For example, in this case, if the interval of convergence of 1/x^2 is the same as that of 1/x, then 1.9 would be one of the values allowed for the expansion of 1/x^2, which would then allow (1.9)^2 = 3.61 to be one of the values to be put into the expansion of 1/x, but 3.61 would then make the series for 1/x diverge. But my reasoning here must be faulty. I would be grateful for corrections.
 
  • #4
nomadreid said:
Ah, thanks, Mark44. On one hand that makes sense, but on the other hand I am worried about what this implies for my principle
"If I is the interval of convergence of the expansion of f(x) , and one substitutes a finite polynomial g(x) in for x to get the expansion of f(g(x)) in terms of x, then the interval of convergence changes from I to {x:g(x) in I}"
Is this a theorem? When you say "my principle" it makes me think that this is something you're hypothesizing. If this is a theorem somewhere (and hence, proven), it's not something I recall seeing, but then the last time I taught calculus was about 17 years ago.
nomadreid said:
concerning the procedure of obtaining a Taylor series expansion for f(g(x)) by substituting y = g(x) into the expansion for f(y). On the face of it, the principle appears reasonable, but from your answer it is apparently not valid. If we allow a value k for x in the new interval of convergence which would make g(k) lie outside of the original one, then substituting g(k) into the original series would lead to divergence, no? For example, in this case, if the interval of convergence of 1/x^2 is the same as that of 1/x, then 1.9 would be one of the values allowed for the expansion of 1/x^2, which would then allow (1.9)^2 = 3.61 to be one of the values to be put into the expansion of 1/x, but 3.61 would then make the series for 1/x diverge.
I'm going to take it at face value that the Taylor series for 1/x, in powers of x - 1, has an interval of convergence of (0, 2) -- i.e., that the radius of this interval is 1. I would guess, without a whole lot of justification, that the interval of convergence of the Taylor series for 1/x^2 is the same interval. If you choose x = 1.9, that's a value inside this interval, so the series for 1/x^2 should converge for that choice of x. I don't see that it's relevant that x^2 happens to be 3.61. That's not the x value.
nomadreid said:
But my reasoning here must be faulty. I would be grateful for corrections.
 
  • #5
Mark44 said:
Is this a theorem? When you say "my principle" it makes me think that this is something you're hypothesizing.
You're right, it is not a proven theorem, but rather my (apparently false) hypothesis (my reasoning is explained below). I am actually more interested in knowing why this hypothesis is false than in working out this particular expansion, which I just selected as an example.
Mark44 said:
I don't see that it's relevant that x^2 happens to be 3.61.
I was thinking that, using the method (gleaned from 3:08-3:15 of a video) of substituting g(x) = y in the expansion of f(y), you would substitute g(x) everywhere for y:
that is, f(y) =... for y in (a,b) translates to
f(g(x)) = ... for g(x) in (a,b).

To take this specific example again,
##\frac 1y = \sum_{n=0}^\infty (1-y)^n## for ##y## in (0,2)
##\frac 1{x^2} = \sum_{n=0}^\infty (1-x^2)^n## for ##x^2## in (0,2)
 
  • #6
nomadreid said:
If we allow a value k for x in the new interval of convergence which would make g(k) lie outside of the original one, then replacing g(k) into the original series would lead to divergence, no? For example, in this case, if the interval of convergence of 1/x2 is the same as that of 1/x, then 1.9 would be one of the values allowed for the expansion of 1/x2 , which would then allow (1.9)2 = 3.61 to be one of the values to be put into the expansion of 1/x, but 3.61 would then make the series for 1/x diverge. But my reasoning here must be faulty. I would be grateful for corrections.
When it comes to convergence of the series, the relevant quantity is ##x-1##, not ##x##, because the series for 1/x is centered about x=1. If you replace ##x## with ##x^2## in the series for 1/x, it will converge if ##-1 < x^2-1 < 1##. Note, however, that you no longer have a series centered about ##x=1##.
\begin{align*}
\frac 1x &= \frac{1}{1+(x-1)} = 1-(x-1)+(x-1)^2-(x-1)^3+\cdots \\
\frac 1{x^2} &= 1-(x^2-1)+(x^2-1)^2-(x^2-1)^3+\cdots
\end{align*} To get the series for ##1/x^2## that's centered about x=1, you want to differentiate the series for 1/x.
$$\frac 1{x^2} = -\frac{d}{dx} \frac 1x = 1-2(x-1)+3(x-1)^2-\cdots.$$ This series converges on (0,2).
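For anyone who wants to see the difference numerically, here is a rough sketch in plain Python (illustrative only, nothing beyond standard prints): at ##x = 1.9## the differentiated (Taylor) series settles down to ##1/1.9^2 \approx 0.2770##, while the substituted series blows up because ##\lvert x^2-1 \rvert = 2.61 > 1##.
[code]
# Partial sums of the two expansions of 1/x^2 at x = 1.9.
# taylor:      1 - 2(x-1) + 3(x-1)^2 - ...    (Taylor series centered at 1)
# substituted: 1 - (x^2-1) + (x^2-1)^2 - ...  (series for 1/u evaluated at u = x^2)

x = 1.9
exact = 1 / x**2

taylor_sum = 0.0
substituted_sum = 0.0
for n in range(200):
    taylor_sum += (n + 1) * (-(x - 1))**n    # (n+1)(-1)^n (x-1)^n
    substituted_sum += (-(x**2 - 1))**n      # (-1)^n (x^2-1)^n

print(f"exact 1/x^2         = {exact:.6f}")           # 0.277008
print(f"Taylor partial sum  = {taylor_sum:.6f}")      # ~0.277008 (converging)
print(f"substituted partial = {substituted_sum:.3e}") # astronomically large (diverging)
[/code]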
 
  • #7
Super! Thanks, Vela. For the example I gave (1/x^2), it is now clear how to get a new series from the old one with the same interval of convergence. However, suppose I do not insist that I end up with the same interval of convergence: then, would the substitution give me a valid expansion, that is, as in your answer,
vela said:
##1-(x^2-1)+(x^2-1)^2-(x^2-1)^3+\cdots##
?
with an interval of convergence of
vela said:
##-1 < x^2-1 < 1##
?
(although wouldn't you have to take out zero from this interval?)
 
  • #8
It should be immediate that, since both ##\frac{1}{x}## and ##\frac{1}{x^2}## are analytic everywhere except at x = 0, the interval of convergence will go "up to" x = 0. The interval of convergence of either, for a Taylor series centered at x = a with a > 0, is (0, 2a).
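Spelled out for a general center ##a > 0##, the geometric series gives
\begin{align*}
\frac 1x &= \frac{1}{a + (x-a)} = \sum_{n=0}^{\infty} \frac{(-1)^n}{a^{n+1}}(x-a)^n, \\
\frac 1{x^2} &= -\frac{d}{dx}\,\frac 1x = \sum_{n=0}^{\infty} \frac{(n+1)(-1)^n}{a^{n+2}}(x-a)^n,
\end{align*} both converging exactly when ##|x-a| < a##, i.e. on ##(0, 2a)##.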
 
  • #9
Thank you, HallsofIvy. So I now have two nice convergent series, each with a different interval of convergence, for the same function (out of an uncountably infinite number of series expansions, but perhaps only a countable number of them are "nice", whatever that would mean :woot: )
 
  • #10
nomadreid said:
Super! Thanks, Vela. For the example I gave (1/x^2), it is now clear how to get a new series from the old one with the same interval of convergence. However, suppose I do not insist that I end up with the same interval of convergence: then, would the substitution give me a valid expansion
Sure, why not? For any x such that ##\lvert x^2-1 \rvert < 1##, the partial sums will be the same as those for the series for 1/u evaluated at ##u=x^2##, and we know the latter series converges there. The thing is, the expansion isn't a Taylor series because each term isn't of the form ##c_n(x-1)^n##. You can do some algebra to convert it to that form, but then you'd end up with a different series with a different interval of convergence.
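For instance, using ##x^2-1 = 2(x-1) + (x-1)^2## and regrouping by powers of ##x-1##:
$$1-(x^2-1)+(x^2-1)^2-\cdots = 1 - 2(x-1) + (-1+4)(x-1)^2 + \cdots = 1-2(x-1)+3(x-1)^2-\cdots,$$ which is the Taylor series from post #6 and converges on ##(0,2)##.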

nomadreid said:
(although wouldn't you have to take out zero from this interval?)
x = 0 isn't in that region to begin with: ##\lvert 0^2-1 \rvert = 1##, which is not less than 1.
 
  • #11
Thanks, vela! That puts the cap on it. All clear.
 
