I know that the Taylor series of [tex] f(x)= \frac{1}{1+x^2} [/tex] around x_{0} = 0 is [tex] 1 - x^2 + x^4 - \cdots + (-1)^n x^{2n} + \cdots [/tex] for |x|<1. But what I want is to construct the Taylor series of the same function around x_{0} = 1. I tried working out the derivatives, but finding a general formula for the n^{th} derivative seems almost impossible (by the fourth derivative I was already suffocating :P).

The thing is, I need this because I want to apply the ratio test (or any other) to find the radius of convergence of the series centered around x_{0} = 1, and what I really want to know is whether that radius is zero or not! The reason I'm interested is that even though [tex] f(x)= \frac{1}{1+x^2} [/tex] is infinitely differentiable at x=1, I want to know whether it is analytic at this point: whether its Taylor series centered there has a non-zero radius of convergence, and, if it does, whether the remainder term given by Taylor's Theorem goes to zero as n goes to infinity within the radius of convergence of the series.

I picked this example because, although f doesn't seem to have a singularity at x=1 in the real domain, I know the Taylor expansion of f around x_{0} = 0 stops converging at x=1 because, in the complex domain, x=i and x=-i are singular points. So I'm hopeful that the Taylor expansion around x_{0} = 1 will not converge at all in any neighbourhood of x_{0} = 1. Then I will have found a function that is infinitely differentiable at a point and yet not analytic at that point, which is what I am searching for :) (the only examples of such functions I've ever come across are those whose derivatives at the point of consideration are all zero).
The function you gave is analytic on all of R. The radius of convergence at [tex]x_0=1[/tex] is [tex]\sqrt{2}[/tex]. Read this: http://en.wikipedia.org/wiki/Radius_of_convergence#Radii_of_convergence_in_complex_analysis

As for an example that is infinitely differentiable but not analytic, with not all derivatives zero: just add a polynomial (or any analytic function) to a non-analytic smooth function.
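If you'd like to see that [tex]\sqrt{2}[/tex] numerically rather than take it on faith, here is a small sympy sketch (my own, not part of the linked article) that expands the function around x_0 = 1 and estimates the radius with the root test on the coefficients:

```python
import sympy as sp

x, w = sp.symbols('x w')
f = 1 / (1 + x**2)

# Taylor coefficients of f around x0 = 1: substitute x = 1 + w, expand in w.
N = 33
s = sp.series(f.subs(x, 1 + w), w, 0, N).removeO()
p = sp.Poly(s, w)
a = [p.coeff_monomial(w**n) for n in range(N)]

# Root test: limsup |a_n|^(1/n) = 1/R.  Every fourth coefficient of this
# particular series vanishes, so use the last nonzero one for the estimate.
n0 = max(n for n in range(1, N) if a[n] != 0)
R_est = 1 / float(abs(a[n0])) ** (1.0 / n0)
print(R_est)  # slowly approaches sqrt(2) ~ 1.414 as N grows
```

The convergence is slow, but even with N = 33 the estimate lands within a few percent of [tex]\sqrt{2}[/tex].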
Thank you very much yyat :) now let me follow that link and digest your answer . . . I really don't know much at all about complex analysis:P
Well, that would be equivalent to finding the power series representation about 0 of: [tex]f(w)=\frac{1}{2-2w+w^{2}}[/tex]
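For what it's worth, a quick sympy check (my own sketch) confirms that rewrite. Note that the substitution matching this form is w = 1 - x, so the Maclaurin coefficients of the new function give the Taylor coefficients at x_0 = 1 up to an alternating sign:

```python
import sympy as sp

x, w = sp.symbols('x w')
f = 1 / (1 + x**2)
g = 1 / (2 - 2*w + w**2)

# The substitution matching this form is w = 1 - x (note the sign):
assert sp.simplify(g.subs(w, 1 - x) - f) == 0

# Since (1-x)^n = (-1)^n (x-1)^n, the Maclaurin coefficients b_n of g give
# the Taylor coefficients a_n of f at x0 = 1 up to an alternating sign.
N = 8
a = sp.Poly(sp.series(f.subs(x, 1 + w), w, 0, N).removeO(), w)
b = sp.Poly(sp.series(g, w, 0, N).removeO(), w)
pairs = [(a.coeff_monomial(w**n), b.coeff_monomial(w**n)) for n in range(N)]
assert all(an == (-1)**n * bn for n, (an, bn) in enumerate(pairs))
```

The sign doesn't affect |a_n|, so either substitution works for the ratio or root test.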
Yyat, thank you very much, I read through the link you posted, and I hope I understood most of what it said, in particular the following result (although I had to read up on what holomorphic meant :P).

Anyhow, I realise that that explains, for example, why the Taylor series of [tex] f(x)= \frac{1}{1+x^2} [/tex] about x=0 has radius of convergence 1, while the series centered around x=3 of the same function has radius of convergence 2. That's because, viewed as a function of a complex variable, the radius of convergence is the distance from the point about which we construct the series to the points i and -i, which are the (only) points at which f is not holomorphic (the function is not defined at those points, but is holomorphic at every other point, since it is continuous and the real functions u and v (real and imaginary parts of f) satisfy the Cauchy-Riemann equations at every point where f is defined, which was quite cumbersome to verify I must say :P). That's why the series centered around x=1 or x=-1 would have radius of convergence [tex] \sqrt{2} [/tex], as that is the distance from 1 (or -1) to i and -i . . . as long as the distance is defined as d(z_{1},z_{2})=|z_{1}-z_{2}| (if not, would the radius of convergence be given by the corresponding distance function?)

-----------------------------------

I also thank you for pointing out (real) functions that are infinitely differentiable at a point and yet not analytic, and whose derivatives at that point are not all zero (I'd never thought to simply add a polynomial to the classic examples, which typically have all derivatives zero at the point in question, such as f(x)=exp(-1/x^{2}) for x≠0 and f(0)=0). Because I'd asked my teacher how one can tell whether a function that is infinitely differentiable at some point is analytic there (without having to construct the series and seeing if it converges and what it converges to . . . ), and he said that as long as it had at least one derivative that was not zero it would be analytic. I wasn't happy with that (as I could find no proof), but couldn't think of a counterexample :P Now I can xD

Um, so is there any easy way to be able to look at a real function and tell whether it is analytic or not? Once again thank you very much Yyat, your help is much appreciated :)
Mathman, um, I think Arildno's version was correct? It's simply defining a new variable, [tex] w=1-x [/tex] (which turns [tex]1+x^2[/tex] into [tex]2-2w+w^2[/tex]), no? Thanks for posting :P
Arildno, thank you :) Ya, I see that . . . you changed the variable. But I'm not sure what the point of doing that is . . . ? Thank you for posting :)
You mean [tex]\sqrt{10}[/tex]...

If you ever dive deeper into complex analysis, which is one of the most beautiful parts of math, you will learn that the elementary functions like exp and sin are holomorphic. Moreover, sums, products and quotients of holomorphic functions are holomorphic, so all rational functions are holomorphic (where defined).

As for the distance: |z_1 - z_2| is more or less the only useful distance for complex numbers, since it's just given by the norm.

Showing that a function is analytic is in general not so easy unless that function is a simple combination of functions you already know are analytic; in the general case you will have to work with the definition directly. It is sometimes easier to see that a function is not analytic using the following theorem, which is a consequence of analytic continuation: if f:U->R is analytic, U is connected, and A is a subset of U with an accumulation point in U (for example an interval) on which f is zero, then f is zero on all of U. Using this, it is easy to see that a bump function cannot be analytic.
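To make that last remark concrete: the classic building block for bump functions is f(x) = exp(-1/x^2) for x ≠ 0, extended by f(0) = 0. Here is a small sympy sketch (my own; the names g and h are made up, and I use the simple one-sided building block rather than a true compactly supported bump) checking that all its derivatives vanish at 0, so its Taylor series there is identically zero even though the function isn't, and that adding a polynomial, as suggested earlier, gives nonzero derivatives while staying non-analytic:

```python
import sympy as sp

x = sp.symbols('x')
# For x != 0 this is the classic smooth-but-not-analytic building block;
# at x = 0 the extended function is defined to be 0, so we take limits there.
g = sp.exp(-1 / x**2)

# Every derivative of g tends to 0 at the origin, so the Taylor series of
# the extended function at 0 is identically zero, yet the function isn't.
g_derivs = [sp.limit(sp.diff(g, x, n), x, 0) for n in range(4)]
print(g_derivs)  # [0, 0, 0, 0]

# Adding a polynomial keeps the function non-analytic at 0 but makes the
# derivatives there nonzero:
h = g + 1 + x + x**2
h_derivs = [sp.limit(sp.diff(h, x, n), x, 0) for n in range(3)]
print(h_derivs)  # [1, 1, 2]
```

Since the Taylor series of the extended g at 0 converges everywhere (to 0) but agrees with the function only at 0, this is exactly a function that is infinitely differentiable yet not analytic at a point.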
Yyat, yes, I meant [tex] \sqrt{10} [/tex], silly me :P And about sums, products and quotients of holomorphic functions being holomorphic: well, that certainly makes life a lot easier! Thank you again xD I've still got to read up on that last theorem you mentioned and on analytic continuation . . . I can't really afford to go too deep into complex analysis right now, but I would like to when I have the chance. Thank you for being so helpful :) Bobby
To find the Taylor expansion it helps to use partial fractions: [tex]\frac{1}{1+z^2} = \frac{1}{2i}\left[\frac{1}{z-i}-\frac{1}{z+i}\right][/tex] From here the n^{th} derivative, and hence the Taylor expansion about any point (other than [tex]z=\pm i[/tex]), is really easy to calculate.
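For anyone following along, the n-th derivative that falls out of that decomposition can be written down and sanity-checked with sympy (a sketch of my own; z0 = 1 is just the point from this thread):

```python
import sympy as sp

z = sp.symbols('z')
f = 1 / (1 + z**2)

def nth_deriv(n, z0):
    # From 1/(1+z^2) = (1/2i)[1/(z-i) - 1/(z+i)] together with
    # d^n/dz^n (z-a)^(-1) = (-1)^n n! (z-a)^(-(n+1)).
    return sp.simplify(
        sp.factorial(n) * (-1)**n / (2 * sp.I)
        * ((z0 - sp.I)**(-(n + 1)) - (z0 + sp.I)**(-(n + 1)))
    )

# Sanity check against direct repeated differentiation at z0 = 1:
for n in range(6):
    direct = sp.diff(f, z, n).subs(z, 1)
    assert sp.simplify(nth_deriv(n, 1) - direct) == 0
```

The Taylor coefficients at z0 are then just nth_deriv(n, z0)/n!, which is what makes the ratio or root test tractable here.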