# Taylor series of 1/(1+x^2) . . . around x=1!

by BobbyBear
Tags: analytic, convergence, expansion, taylor
 P: 162 I know that the Taylor series of $$f(x)= \frac{1}{1+x^2}$$ around x0 = 0 is $$1 - x^2 + x^4 - ... + (-1)^n x^{2n} + ...$$ for |x| < 1. But what I want is to construct the Taylor series of the same function around x0 = 1. I tried working out the derivatives, but finding a general formula for the nth derivative seems almost impossible (by the fourth derivative I was already suffocating :P).

The thing is, I need this because I want to apply the ratio test (or any other) to find the radius of convergence of the series centered at x0 = 1. And the reason I want to do this is that I really want to know whether that radius of convergence is zero or not!

Even though $$f(x)= \frac{1}{1+x^2}$$ is infinitely differentiable at x = 1, I want to know if it's analytic at that point: does its Taylor series centered there have a non-zero radius of convergence, and if so, does the remainder term given by Taylor's theorem go to zero as n goes to infinity within that radius?

I picked this example because, although f doesn't seem to have a singularity at x = 1 in the real domain, the reason the Taylor expansion of f around x0 = 0 stops converging at x = 1 is that in the complex domain x = i and x = -i are singular points. So I'm hopeful that the Taylor expansion around x0 = 1 will not converge in any neighbourhood of x0 = 1. And then I will have found a function that is infinitely differentiable at a point and yet not analytic at that point, which is what I am searching for :) (the only examples of such functions I've ever come across are those whose derivatives at the point of consideration are all zero).
P: 321
The function you gave is analytic on R. The radius of convergence at $$x_0=1$$ is $$\sqrt{2}$$.
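This value can be sanity-checked numerically. Here is a minimal Python sketch (added by the editor, not part of the original post; `taylor_coeff` is a made-up helper name): it computes the Taylor coefficients of 1/(1+x^2) about x0 = 1 from the partial-fraction expansion over C and applies the root test.

```python
# Partial fractions over C: 1/(1+x^2) = (1/(2i)) * (1/(x-i) - 1/(x+i)).
# Expanding each term as a geometric series about x = 1 (w = x - 1) gives
#   c_n = ((-1)^n / (2i)) * (1/(1-i)^(n+1) - 1/(1+i)^(n+1)).

def taylor_coeff(n):
    """n-th Taylor coefficient of 1/(1+x^2) about x0 = 1 (a real number)."""
    i = 1j
    return ((-1) ** n / (2 * i)) * (1 / (1 - i) ** (n + 1) - 1 / (1 + i) ** (n + 1))

# Root test: |c_n|^(-1/n) should approach the radius of convergence, sqrt(2).
# Note c_n vanishes when n = 3 mod 4, so we sample n where it is nonzero.
for n in (10, 50, 200):
    print(n, abs(taylor_coeff(n)) ** (-1 / n))
```

Indeed, the estimates drift toward sqrt(2) ≈ 1.414 as n grows, and c_0 = 1/2, c_1 = -1/2 match f(1) and f'(1) as a cross-check.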

 Quote by BobbyBear And then I will have found a function that is infinitely differentiable at a point and yet not analytic at that point, which is what I am searching for :) (the only examples of such functions I've ever come across are those whose derivatives at the point of consideration are all zero).
Just add a polynomial (or analytic function) to a non-analytic function.
 P: 162 Thank you very much, yyat :) Now let me follow that link and digest your answer . . . I really don't know much at all about complex analysis :P
 Sci Advisor HW Helper PF Gold P: 12,016 Well, that would be equivalent to finding the power series representation about 0 of: $$f(w)=\frac{1}{2-2w+w^{2}}$$
P: 6,057
 Quote by arildno Well, that would be equivalent to finding the power series representation about 0 of: $$f(w)=\frac{1}{2-2w+w^{2}}$$
Should be $$f(w)=\frac{1}{2+2w+w^{2}}$$
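A quick numerical check (added by the editor) confirms the corrected denominator: substituting w = x - 1 into 1 + x^2 gives 1 + (w + 1)^2 = 2 + 2w + w^2.

```python
# Verify that 1/(1+x^2) equals 1/(2+2w+w^2) with w = x - 1, at sample points.
for x in (-2.0, 0.3, 1.0, 4.5):
    w = x - 1
    assert abs(1 / (1 + x * x) - 1 / (2 + 2 * w + w * w)) < 1e-12
print("substitution w = x - 1 checks out")
```

For what it's worth, arildno's denominator 2 - 2w + w^2 is what you get with the substitution w = 1 - x instead; since |x - 1| = |1 - x|, both forms give the same radius of convergence.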
P: 162
Yyat, thank you very much. I read through the link you posted, and I hope I understood most of what it said. In particular, the following result:

 The radius of convergence of a power series f centered on a point a is equal to the distance from a to the nearest point where f cannot be defined in a way that makes it holomorphic.

Anyhow, I realise that that explains, for example, why the Taylor series of

$$f(x)= \frac{1}{1+x^2}$$

about x=0 has radius of convergence 1, while the series centered at x=3 of the same function has radius of convergence 2. That's because, viewed as a function of a complex variable, the radius of convergence is the distance from the point about which we construct the series to the points i and -i, which are the only points at which f is not holomorphic: the function is not defined at those points, but is holomorphic at every other point, as it is continuous and the real functions u and v (the real and imaginary parts of f) satisfy the Cauchy-Riemann equations at every point where f is defined (verifying that was quite cumbersome, I must say :P). That's why the series centered at x=1 or x=-1 would have radius of convergence $$\sqrt{2}$$, as that is the distance from 1 (or -1) to i and -i . . . as long as the distance is defined as d(z1, z2) = |z1 - z2| (if not, would the radius of convergence be given by the corresponding distance function?)
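Verifying Cauchy-Riemann by hand is indeed tedious. A numerical spot-check (a sketch added by the editor; the helper name `f` is mine) compares the difference quotient along the real and imaginary axes; these agree in the limit exactly when the Cauchy-Riemann equations hold at the point.

```python
# f(z) = 1/(1+z^2) on the complex plane (defined away from z = +-i).
def f(z):
    return 1 / (1 + z * z)

h = 1e-6
for z in (1 + 0j, 0.5 - 2j, -3 + 1j):
    d_real = (f(z + h) - f(z)) / h                # derivative along the real axis
    d_imag = (f(z + 1j * h) - f(z)) / (1j * h)    # derivative along the imaginary axis
    assert abs(d_real - d_imag) < 1e-4            # agreement => CR holds at z
print("Cauchy-Riemann equations hold at the sample points")
```

This is only a finite-difference spot-check at a few points, of course, not a proof; the clean argument is the one yyat gives below about rational functions being holomorphic wherever defined.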

-----------------------------------

I also thank you for pointing out to me (real) functions that are infinitely differentiable at a point and yet not analytic, whose derivatives at that point are not all zero (I'd never thought to simply add a polynomial to the classic examples, which typically have all derivatives zero at the point in question, such as f = exp(-1/x^2) for x ≠ 0 and f = 0 for x = 0). I'd asked my teacher how one can tell whether a function that is infinitely differentiable at some point is analytic or not (without having to construct the series and see if it converges, and where it converges to . . . ), and he said that as long as it had at least one derivative that was not zero it would be analytic . . . I wasn't happy with that (as I could find no proof), but couldn't think of a counterexample :P Now I can xD

Um, so is there any easy way to look at a real function and tell whether it is analytic or not?

Once again thank you very much Yyat, your help is much appreciated :)
 P: 162 Mathman, um, I think Arildno's version was correct? It's simply defining a new variable $$w=x-1$$ no? Thanks for posting :P
 P: 162 Arildno, thank you :) Ya, I see that . . . you changed the variable. But I'm not sure what the point of doing that is . . . ? Thank you for posting :)
P: 321
 Quote by BobbyBear Yyat, thank you very much. I read through the link you posted, and I hope I understood most of what it said . . . although I had to read up on what holomorphic meant :P Anyhow, I realise that that explains, for example, why the Taylor series of $$f(x)= \frac{1}{1+x^2}$$ about x=0 has radius of convergence 1, while the series centered at x=3 of the same function has radius of convergence 2,
You mean $$\sqrt{10}$$...

 and that's because, viewed as a function of a complex variable, the radius of convergence is the distance from the point about which we construct the series to the points i and -i, which are the only points at which f is not holomorphic (since the function is not defined at those points, but is holomorphic at every other point, as it is continuous and the real functions u and v, the real and imaginary parts of f, satisfy the Cauchy-Riemann equations at every point where f is defined; verifying that was quite cumbersome, I must say :P).
If you ever dive deeper into complex analysis, which is one of the most beautiful parts of math, you will learn that the elementary functions like exp and sin are holomorphic. Moreover, sums, products and quotients of holomorphic functions are holomorphic, so all rational functions are holomorphic (wherever they are defined).

 That's why the series centered at x=1 or x=-1 would have radius of convergence $$\sqrt{2}$$, as that is the distance from 1 (or -1) to i and -i . . . as long as the distance is defined as d(z1, z2) = |z1 - z2| (if not, would the radius of convergence be given by the corresponding distance function?)
This is more or less the only useful distance for complex numbers, since it's the one induced by the norm (the absolute value).

 Quote by BobbyBear I also thank you for pointing out to me (real) functions that are infinitely differentiable at a point and yet not analytic, whose derivatives at that point are not all zero (I'd never thought to simply add a polynomial to the classic examples, which typically have all derivatives zero at the point in question, such as f = exp(-1/x^2) for x ≠ 0 and f = 0 for x = 0). I'd asked my teacher how one can tell whether a function that is infinitely differentiable at some point is analytic or not (without having to construct the series and see if it converges, and where it converges to . . . ), and he said that as long as it had at least one derivative that was not zero it would be analytic . . . I wasn't happy with that (as I could find no proof), but couldn't think of a counterexample :P Now I can xD Um, so is there any easy way to look at a real function and tell whether it is analytic or not?
Showing that a function is analytic is in general not so easy unless that function is a simple combination of functions you already know are analytic. In the general case you will have to work with the definition directly.

It is sometimes easier to see that a function is not analytic using the following theorem, which is a consequence of analytic continuation:

If f: U -> R is analytic, U is connected, A is a subset of U with an accumulation point in U (for example, an interval), and f is zero on A, then f is zero on all of U.

Using this, it is easy to see that a bump function cannot be analytic.
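For concreteness, here is the classic flat function mentioned earlier in the thread, in a short Python sketch (added by the editor; `flat` is a made-up name). Its Taylor series at 0 is identically zero, yet the function is nonzero at every other point, so the series cannot represent it on any neighbourhood of 0.

```python
import math

def flat(x):
    """f(x) = exp(-1/x^2) for x != 0, f(0) = 0: smooth, all derivatives 0 at x = 0."""
    return math.exp(-1.0 / (x * x)) if x != 0 else 0.0

print(flat(0))    # the zero Taylor series at 0 is correct here
print(flat(0.5))  # but here f(0.5) = e^-4, about 0.018, while the series predicts 0

# Per yyat's remark above: g(x) = x + flat(x) is still not analytic at 0,
# yet g'(0) = 1, so not all of its derivatives vanish there.
```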
P: 162
Yyat,

yes, I meant $$\sqrt{10}$$, silly me:P
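To close the loop, the "distance to the nearest singularity" rule from yyat's link can be tabulated for the centers discussed in this thread (a small sketch added by the editor; `radius` is a made-up helper name). For a real center a, the nearest singularities of 1/(1+x^2) are at ±i, both at distance sqrt(a^2 + 1).

```python
# Radius of convergence of the Taylor series of 1/(1+x^2) about a real center a:
# the distance from a to the singularity at i (the one at -i is equally far).
def radius(a):
    return abs(a - 1j)

print(radius(0))  # 1.0       : series about 0 converges for |x| < 1
print(radius(1))  # 1.414...  : sqrt(2), as yyat said
print(radius(3))  # 3.162...  : sqrt(10), the corrected value
```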