Taylor series of 1/(1+x^2) around x=1

In summary, the Taylor series of f(x)=\frac{1}{1+x^2} around x0=0 is 1-x^2+x^4-...+(-1)^n x^{2n}+... for |x|<1. The Taylor series of the same function around x0=1 does converge: its radius of convergence is \sqrt{2}, the distance from 1 to the nearest singular points i and -i in the complex domain. This illustrates the general fact that the radius of convergence of a power series equals the distance from its center to the nearest point where the function fails to be holomorphic.
  • #1
BobbyBear
I know that the Taylor Series of

[tex]

f(x)= \frac{1}{1+x^2}

[/tex]

around x0 = 0

is


[tex] 1 - x^2 + x^4 + ... + (-1)^n x^{2n} + ... [/tex] for |x|<1

But what I want is to construct the Taylor Series of

[tex]

f(x)= \frac{1}{1+x^2}

[/tex]

around x0 = 1. I tried working out the derivatives, but trying to find a general formula for the nth derivative is almost impossible (by the fourth derivative I was already suffocating:P). The thing is, I need this because I want to apply the ratio test (or any other) to find the Radius of Convergence of the series centered around x0 = 1. And the reason I want to do this, is because I really want to know if the radius of convergence is zero or not! And the reason I'm interested in this is, because even though

[tex]

f(x)= \frac{1}{1+x^2}

[/tex]

is infinitely differentiable at x=1, I want to know if it's analytic at this point or not! I want to know if its Taylor series centered around that point has a non-zero radius of convergence, and if it does, whether the remainder term given by Taylor's Theorem goes to zero as n goes to infinity within the radius of convergence of the series. The reason I picked this example is because I know that although f doesn't seem to have a singularity at x=1 in the real domain, the reason the Taylor expansion of f around x0 = 0 stops converging at x=1 is because in the complex domain, x=i and -i are singular points, so I'm hopeful that the Taylor expansion around x0 = 1 will not converge at all in any neighbourhood of x0 = 1.

And then I will have found a function that is infinitely differentiable at a point and yet not analytic at that point, which is what I am searching for :) (the only examples of such functions I've ever come across are those whose derivatives at the point of consideration are all zero).
 
  • #2
The function you gave is analytic on R. The radius of convergence at [tex]x_0=1[/tex] is [tex]\sqrt{2}[/tex].

Read this: http://en.wikipedia.org/wiki/Radius_of_convergence#Radii_of_convergence_in_complex_analysis

BobbyBear said:
And then I will have found a function that is infinitely differentiable at a point and yet not analytic at that point, which is what I am searching for :) (the only examples of such functions I've ever come across are those whose derivatives at the point of consideration are all zero).

Just add a polynomial (or analytic function) to a non-analytic function.
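For a concrete instance of this trick, one can take the classic smooth-but-not-analytic function exp(-1/x^2) (set to 0 at x=0, all of whose derivatives at 0 vanish) and add the polynomial x. A minimal Python sketch, with function names of my own choosing:

```python
import math

def g(x):
    # Classic smooth, non-analytic function: every derivative at 0 is 0,
    # so its Taylor series at 0 is identically zero, yet g(x) > 0 for x != 0.
    return math.exp(-1.0 / x**2) if x != 0 else 0.0

def h(x):
    # Adding the polynomial x gives h'(0) = 1 (not all derivatives zero),
    # yet h is still not analytic at 0: its Taylor series there is just x,
    # while h(x) differs from x at every x != 0.
    return x + g(x)

print(h(0.5) - 0.5)  # equals exp(-4), about 0.0183: the series at 0 misses this
```

So the teacher's criterion ("at least one non-zero derivative implies analytic") fails for h.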
 
  • #3
Thank you very much yyat :) now let me follow that link and digest your answer . . . I really don't know much at all about complex analysis:P
 
  • #4
Well, that would be equivalent to finding the power series representation about 0 of:
[tex]f(w)=\frac{1}{2-2w+w^{2}}[/tex]
 
  • #6
arildno said:
Well, that would be equivalent to finding the power series representation about 0 of:
[tex]f(w)=\frac{1}{2-2w+w^{2}}[/tex]

Should be [tex]f(w)=\frac{1}{2+2w+w^{2}}[/tex]
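One payoff of the substitution w = x - 1: multiplying the unknown series by the denominator 2 + 2w + w^2 and matching powers of w gives a simple two-term recurrence for the Taylor coefficients. A rough Python sketch of that recurrence (my own code, not any poster's):

```python
# Taylor coefficients c_n of 1/(2 + 2w + w^2) about w = 0.
# Matching powers of w in (2 + 2w + w^2) * sum(c_n w^n) = 1 gives
#   2*c_0 = 1, and for n >= 1:  2*c_n + 2*c_{n-1} + c_{n-2} = 0
# (with the convention c_{-1} = 0).
def coefficients(count):
    c = [0.5]  # c_0 = 1/2
    for n in range(1, count):
        prev2 = c[n - 2] if n >= 2 else 0.0
        c.append(-(2 * c[n - 1] + prev2) / 2)
    return c

# First few coefficients: 1/2, -1/2, 1/4, 0, -1/8, 1/8
print(coefficients(6))
```

The partial sums then converge to 1/(1+x^2) for |x - 1| < sqrt(2).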
 
  • #7
Yyat, thank you very much, I read through the link you posted, I hope I understood most of what it said. In particular, the following result:

The radius of convergence of a power series f centered on a point a is equal to the distance from a to the nearest point where f cannot be defined in a way that makes it holomorphic.

although I had to read up on what holomorphic meant:P

Anyhow, I realize that that explains, for example, why the Taylor series of

[tex]

f(x)= \frac{1}{1+x^2}

[/tex]

about x=0 has radius of convergence 1, while the series centered around x=3 of the same function has radius of convergence 2, and that's because viewed as a function of complex variable, the radius of convergence is the distance from the point about which we construct the series to the points i and -i which are the (only) points at which f(x) is not holomorphic (since the function is not defined at those points, but is holomorphic at every other point, as it is continuous and the real functions u and v (real and imaginary parts of f) satisfy the Cauchy-Riemann equations at every point in which f is defined -which was quite cumbersome to verify I must say:P). That's why the series centered around x=1 or x=-1 would have radius of convergence [tex]
\sqrt{2}
[/tex], as that is the distance from 1 (or -1) to i and -i . . . as long as the distance is defined as d(z1,z2)=|z1-z2| (if not the radius of convergence would be given by the corresponding distance function?)

-----------------------------------

I also thank you for pointing out to me (real) functions that are infinitely differentiable at a point and yet not analytic, and whose derivatives at that point are not all zero (I'd never thought to simply add a polynomial to the classic examples which typically have all derivatives zero at the point in question, such as f=exp(-1/x^2) for x≠0 and f=0 for x=0). Because I'd asked my teacher how one can tell whether a function that is infinitely differentiable at some point is analytic or not (without having to construct the series and seeing if it converges and where it converges to . . . ), and he said that as long as it had at least one derivative that was not zero it would be analytic . . . I wasn't happy with that (as I could find no proof), but couldn't think of a counterexample:P Now I can xD

Um, so is there any easy way to be able to look at a real function and tell whether it is analytic or not?

Once again thank you very much Yyat, your help is much appreciated :)
 
  • #8
Mathman, um, right: defining a new variable [tex]
w=x-1
[/tex]
gives 1+x^2 = 1+(1+w)^2 = 2+2w+w^2, so your version with the plus sign is the correct one for expanding about x0 = 1 (Arildno's would correspond to expanding about x0 = -1), no?

Thanks for posting :P
 
  • #9
Arildno, thank you :) Ya, I see that . . . you changed the variable. But I'm not sure what the point of doing that is . . . ?

Thank you for posting :)
 
  • #10
BobbyBear said:
Yyat, thank you very much, I read through the link you posted, I hope I understood most of what it said. In particular, the following result:
although I had to read up on what holomorphic meant:P

Anyhow, I realize that that explains, for example, why the Taylor series of

[tex]

f(x)= \frac{1}{1+x^2}

[/tex]

about x=0 has radius of convergence 1, while the series centered around x=3 of the same function has radius of convergence 2,
You mean [tex]\sqrt{10}[/tex]...

and that's because viewed as a function of complex variable, the radius of convergence is the distance from the point about which we construct the series to the points i and -i which are the (only) points at which f(x) is not holomorphic (since the function is not defined at those points, but is holomorphic at every other point, as it is continuous and the real functions u and v (real and imaginary parts of f) satisfy the Cauchy-Riemann equations at every point in which f is defined -which was quite cumbersome to verify I must say:P).

If you ever dive deeper into complex analysis, which is one of the most beautiful parts of math, you will learn that the elementary functions like exp, sin are holomorphic. Moreover sums, products and quotients of holomorphic functions are holomorphic, so all rational functions are holomorphic (where defined).

That's why the series centered around x=1 or x=-1 would have radius of convergence [tex]
\sqrt{2}
[/tex], as that is the distance from 1 (or -1) to i and -i . . . as long as the distance is defined as d(z1,z2)=|z1-z2| (if not the radius of convergence would be given by the corresponding distance function?)

This is more or less the only useful distance for complex numbers, since it's just given by the norm.
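As a quick sanity check of the distance-to-singularity rule, Python's complex numbers make the computation one line (the function name is mine, not from the thread):

```python
# Distance from a real center a to the singularities ±i of 1/(1+z^2).
# The radius of convergence of the Taylor series about a equals this distance.
def radius_of_convergence(a):
    return min(abs(a - 1j), abs(a + 1j))

print(radius_of_convergence(0))  # 1.0          -> radius 1 about x0 = 0
print(radius_of_convergence(1))  # about 1.414  -> radius sqrt(2) about x0 = 1
print(radius_of_convergence(3))  # about 3.162  -> radius sqrt(10) about x0 = 3
```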

-----------------------------------

I also thank you for pointing out to me (real) functions that are infinitely differentiable at a point and yet not analytic, and whose derivatives at that point are not all zero (I'd never thought to simply add a polynomial to the classic examples which typically have all derivatives zero at the point in question, such as f=exp(-1/x^2) for x≠0 and f=0 for x=0). Because I'd asked my teacher how one can tell whether a function that is infinitely differentiable at some point is analytic or not (without having to construct the series and seeing if it converges and where it converges to . . . ), and he said that as long as it had at least one derivative that was not zero it would be analytic . . . I wasn't happy with that (as I could find no proof), but couldn't think of a counterexample:P Now I can xD

Um, so is there any easy way to be able to look at a real function and tell whether it is analytic or not?

Showing that a function is analytic is in general not so easy unless that function is a simple combination of functions you already know are analytic. In the general case you will have to work with the definition directly.

It is sometimes easier to see that a function is not analytic using the following theorem, which is a consequence of http://en.wikipedia.org/wiki/Analytic_continuation :

If f:U->R is analytic, U is connected, A is a subset of U with accumulation point in U (for example an interval) and f is zero on A, then f is zero in U.

Using this, it is easy to see that a http://en.wikipedia.org/wiki/Bump_function cannot be analytic.
 
  • #11
Yyat,

yes, I meant [tex]
\sqrt{10}
[/tex], silly me:P

About:
If you ever dive deeper into complex analysis, which is one of the most beautiful parts of math, you will learn that the elementary functions like exp, sin are holomorphic. Moreover sums, products and quotients of holomorphic functions are holomorphic, so all rational functions are holomorphic (where defined).

Well, that certainly makes life a lot easier! Thank you again xD

I've still got to read up on that last theorem you mentioned of analytic continuation . . . I can't really afford to go too deep into complex analysis right now but I would like to when I have a chance.

Thank you for being so helpful :)

Bobby
 
  • #12
To find the Taylor expansion it helps to use partial fractions:

1/(1+z^2) = (1/2i)[1/(z-i) - 1/(z+i)]

From here the nth derivative, and hence the Taylor expansion about any point (other than z = ±i), is easy to calculate.
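Expanding each partial-fraction term as a geometric series in (z - 1) gives a closed form for the coefficients about z = 1. A sketch of the resulting formula with a numerical check (variable names are mine):

```python
def a(n):
    # n-th Taylor coefficient of 1/(1+z^2) about z = 1, from
    # 1/(1+z^2) = (1/2i)[1/(z-i) - 1/(z+i)], where each term expands as
    # 1/(z -+ i) = sum_n (-1)^n (z-1)^n / (1 -+ i)^(n+1).
    coeff = ((-1) ** n / 2j) * (1 / (1 - 1j) ** (n + 1) - 1 / (1 + 1j) ** (n + 1))
    return coeff.real  # the imaginary part cancels (up to rounding)

# First few coefficients: 1/2, -1/2, 1/4, 0, -1/8, ...
print([round(a(n), 6) for n in range(5)])

# |a_n|^(1/n) tends to 1/sqrt(2), so the radius of convergence is sqrt(2)
print(abs(a(50)) ** (1.0 / 50))
```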
 

1. What is the Taylor series for 1/(1+x^2) around x=1?

The Taylor series for 1/(1+x^2) around x=1 is:
f(x) = 1/2 - (1/2)(x-1) + (1/4)(x-1)^2 - (1/8)(x-1)^4 + (1/8)(x-1)^5 - ...
(the (x-1)^3 term vanishes).

2. How do you find the coefficients in the Taylor series for 1/(1+x^2) around x=1?

The coefficients can be found using the formula:
a_n = f^(n)(1)/n!
where n is the order of the derivative and f^(n)(1) is the value of the nth derivative of f at x=1. In practice it is easier to use the partial-fraction decomposition or a power-series recurrence, as discussed in the thread above.

3. What is the radius of convergence for the Taylor series of 1/(1+x^2) around x=1?

The radius of convergence is the distance from the center of the series (x=1) to the nearest point where the function fails to be holomorphic. Those points are the complex singularities i and -i, so the radius of convergence is |1 - i| = sqrt(2).

4. What is the interval of convergence for the Taylor series of 1/(1+x^2) around x=1?

The interval of convergence is the range of values of x for which the series converges. In this case it is (1 - sqrt(2), 1 + sqrt(2)), i.e. all real x within sqrt(2) of the center x=1.

5. What is the significance of the Taylor series for 1/(1+x^2) around x=1?

It shows that the function is analytic at x=1 even though its series about x=0 stops converging at |x|=1: the radius of convergence is governed by the complex singularities at ±i, not by any bad behavior on the real line. Within the interval of convergence, the partial sums give arbitrarily good polynomial approximations to the function near x=1.
