# Interval of convergence for the Taylor series expansion of 1/x²

1. Jan 6, 2016

1. The problem statement, all variables and given/known data
Find the interval of convergence of the Taylor series expansion of 1/x², given that the interval of convergence of the Taylor series of 1/x centered at 1 is (0, 2).

2. Relevant equations
(Hypothesized) If I is the interval of convergence of the expansion of f(x), and one substitutes a finite polynomial g(x) in for x to get the expansion of f(g(x)) in terms of x, then the interval of convergence changes from I to {x : g(x) ∈ I}.

3. The attempt at a solution
So it would seem that going from 1/x, with interval of convergence (0, 2), to 1/x², the new interval of convergence would be (-√2, 0) ∪ (0, √2). But this is not a single interval, and it would seem to have two centers.

2. Jan 6, 2016

### Staff: Mentor

Moved from Precalc section -- questions about Taylor series pertain to calculus.
The Taylor series for 1/x here is centered at 1, which means that the Taylor series will be in powers of (x - 1). The same will be true of the series for 1/x². IOW, 1 will be in the middle of the interval. That is not the case with your assumed intervals.

3. Jan 6, 2016

Ah, thanks, Mark44. On one hand that makes sense, but on the other I am worried about the implication for my principle
"If I is the interval of convergence of the expansion of f(x), and one substitutes a finite polynomial g(x) in for x to get the expansion of f(g(x)) in terms of x, then the interval of convergence changes from I to {x : g(x) ∈ I}"
about the procedure of obtaining a Taylor series expansion for f(g(x)) by substituting g(x) = y into the expansion for f(y). On the face of it the principle appears reasonable, but apparently, from your answer, it is not valid. Yet if we allow a value k for x in the new interval of convergence which makes g(k) lie outside the original one, then substituting g(k) into the original series would lead to divergence, no? For example, in this case: if the interval of convergence of 1/x² were the same as that of 1/x, then 1.9 would be one of the values allowed for the expansion of 1/x², which would then allow (1.9)² = 3.61 to be one of the values put into the expansion of 1/x; but 3.61 makes the series for 1/x diverge. My reasoning here must be faulty somewhere, and I would be grateful for corrections.
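The divergence the poster worries about can be checked numerically. This is only a sketch, assuming the standard expansion $\frac1y = \sum_{n\ge 0} (-1)^n (y-1)^n$ about $y = 1$ (which vela writes out later in the thread); the function name `partial_sums` is made up for illustration.

```python
def partial_sums(y, terms=20):
    """Partial sums of sum_{n>=0} (-1)^n (y - 1)^n, the series for 1/y about y = 1."""
    total, sums = 0.0, []
    for n in range(terms):
        total += (-1) ** n * (y - 1) ** n
        sums.append(total)
    return sums

# Inside (0, 2) the partial sums settle toward 1/y:
print(partial_sums(1.5)[-1])   # close to 1/1.5 ~ 0.6667
# At y = 3.61 (outside (0, 2)) the terms (-1)^n (2.61)^n blow up,
# so the partial sums oscillate with exploding magnitude:
print(abs(partial_sums(3.61)[-1]))
```

So the poster's specific observation is right: plugging y = 3.61 into the series for 1/y does diverge.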

4. Jan 7, 2016

### Staff: Mentor

Is this a theorem? When you say "my principle" it makes me think that this is something you're hypothesizing. If this is a theorem somewhere (and hence, proven), it's not something I recall seeing, but then the last time I taught calculus was about 17 years ago.
I'm going to take it at face value that the Taylor series for 1/x, in powers of x - 1, has an interval of convergence of (0, 2) -- i.e., that the radius of this interval is 1. I would guess, without a whole lot of justification, that the Taylor series for 1/x² has the same interval. If you choose x = 1.9, that's a value inside this interval, so the series for 1/x² should converge for that choice of x. I don't see that it's relevant that x² happens to be 3.61. That's not the x value.

5. Jan 7, 2016

You're right, it is not a proven theorem, but my (apparently false) hypothesis (my reasoning is explained below). I am actually more interested in knowing why this hypothesis is false than in working out this particular expansion, which I just selected as an example.
I was thinking that, using the method (gleaned from 3:08-3:15 of a linked video) of substituting g(x) = y into the expansion of f(y), you would substitute g(x) everywhere for y:
that is, f(y) = ..... for y in (a, b) translates to
f(g(x)) = ..... for g(x) in (a, b).

To take this specific example again,
$$\frac{1}{y} = \sum_{n=0}^{\infty} (-1)^n (y-1)^n \quad \text{for } y \in (0,2)$$
$$\frac{1}{x^2} = \sum_{n=0}^{\infty} (-1)^n (x^2-1)^n \quad \text{for } x^2 \in (0,2)$$
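As a numeric sanity check of the substitution (a sketch only; `substituted_series` is a name made up here), the substituted series should reproduce 1/x² whenever x² lies inside (0, 2):

```python
def substituted_series(x, terms=200):
    """Sum of (-1)^n (x^2 - 1)^n, i.e. the series for 1/y with y = x^2 plugged in."""
    return sum((-1) ** n * (x * x - 1) ** n for n in range(terms))

x = 1.2  # x^2 = 1.44, so x^2 - 1 = 0.44 lies inside (-1, 1)
print(substituted_series(x), 1 / x ** 2)  # the two values agree
```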

6. Jan 7, 2016

### vela

Staff Emeritus
When it comes to convergence of the series, the relevant quantity is $x-1$, not $x$, because the series for 1/x is centered about x=1. If you replace $x$ with $x^2$ in the series for 1/x, it will converge if $-1 < x^2-1 < 1$. Note, however, that you no longer have a series centered about $x=1$.
\begin{align*}
\frac 1x &= \frac{1}{1+(x-1)} = 1-(x-1)+(x-1)^2-(x-1)^3+\cdots \\
\frac 1{x^2} &= 1-(x^2-1)+(x^2-1)^2-(x^2-1)^3+\cdots
\end{align*} To get the series for $1/x^2$ that's centered about x=1, you want to differentiate the series for 1/x.
$$\frac 1{x^2} = -\frac{d}{dx} \frac 1x = 1-2(x-1)+3(x-1)^2-\cdots.$$ This series converges on (0,2).
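vela's differentiated series can be spot-checked numerically. This is a sketch (the helper name `diff_series` is invented here), using the coefficient pattern $(-1)^n (n+1)$ read off from the series $1 - 2(x-1) + 3(x-1)^2 - \cdots$:

```python
def diff_series(x, terms=500):
    """Partial sum of sum_{n>=0} (-1)^n (n + 1) (x - 1)^n, the series for 1/x^2 about x = 1."""
    return sum((-1) ** n * (n + 1) * (x - 1) ** n for n in range(terms))

# Values across (0, 2), including x = 1.9, which worried the original poster:
for x in (0.5, 1.0, 1.9):
    print(x, diff_series(x), 1 / x ** 2)  # partial sum vs. exact value
```

In particular, at x = 1.9 this series converges to 1/3.61 with no trouble, confirming Mark44's point that only x - 1, not x², matters for this centered series.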

7. Jan 7, 2016

Super! Thanks, vela. For the example I gave (1/x²), it is now clear how to get a new series from the old one with the same interval of convergence. However, suppose I do not insist on ending up with the same interval of convergence: would the substitution then give me a valid expansion, that is, as in your answer,
$$\frac 1{x^2} = 1-(x^2-1)+(x^2-1)^2-(x^2-1)^3+\cdots$$
with an interval of convergence of $(-\sqrt{2}, \sqrt{2})$?
(Although wouldn't you have to take out zero from this interval?)

8. Jan 7, 2016

### HallsofIvy

It should be immediate that, since both $\frac{1}{x}$ and $\frac{1}{x^2}$ are analytic everywhere except at x = 0, the interval of convergence will go "up to" x = 0. The interval of convergence of either, for a Taylor series centered at x = a with a > 0, is (0, 2a).
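HallsofIvy's (0, 2a) claim can be illustrated numerically. This sketch assumes the standard expansion $\frac1x = \frac1a \sum_{n\ge 0} (-1)^n \left(\frac{x-a}{a}\right)^n$ about $x = a$, valid for $|x - a| < a$; the function name `series_about` is made up for the example:

```python
def series_about(a, x, terms=300):
    """Partial sum of the Taylor series for 1/x centered at a, evaluated at x."""
    return sum((-1) ** n * ((x - a) / a) ** n for n in range(terms)) / a

# Centered at a = 2, the interval of convergence is (0, 4):
a = 2.0
print(series_about(a, 3.5), 1 / 3.5)  # x = 3.5 is inside (0, 4); values agree
```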

9. Jan 7, 2016

Thank you, HallsofIvy. So I now have two nice convergent series, each with a different interval of convergence, for the same function (out of an uncountably infinite number of series expansions, but perhaps only a countable number of them are "nice", whatever that would mean).

10. Jan 7, 2016

### vela

Staff Emeritus
Sure, why not? For any x such that $\lvert x^2-1 \rvert < 1$, the partial sums will be the same as those for the series for 1/u evaluated at $u=x^2$, and we know the latter series converges there. The thing is, the expansion isn't a Taylor series, because each term isn't of the form $c_n(x-1)^n$. You can do some algebra to convert it to that form, but then you'd end up with a different series with a different interval of convergence.

x = 0 isn't in the interval: there $x^2 - 1 = -1$, so the condition $\lvert x^2-1 \rvert < 1$ fails.
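A quick look at x = 0 makes this concrete (a sketch of the term-by-term behavior): each term of $1-(x^2-1)+(x^2-1)^2-\cdots$ becomes $(-1)^n(-1)^n = 1$, so the partial sums run 1, 2, 3, ... and diverge.

```python
x = 0.0
# First few terms of the substituted series at x = 0: every term equals 1.
terms = [(-1) ** n * (x * x - 1) ** n for n in range(5)]
print(terms)
```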

11. Jan 7, 2016

Thanks, vela! That puts the cap on it. All clear.