# Root vs. ratio in determining radius of convergence of a power series

1. Mar 16, 2009

### brush

Hi everyone

When determining the radius of convergence of a power series, when should I use the ratio ($a_{n+1}/a_{n}$) test versus the root ($\left|a_{n}\right|^{1/n}$) test?

I know that I'm supposed to use the ratio test when there are factorials, but other than that, are these tests basically interchangeable?

Also, are there any differences in usage/application of the tests in the context of determining the radius of convergence of a power series?
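For reference (not part of the original question): for a power series $$\sum a_{n}x^{n}$$ both tests estimate the same quantity, so whenever both limits exist they give the same radius of convergence $R$:

$$\frac{1}{R} = \lim_{n\to\infty}\left|\frac{a_{n+1}}{a_{n}}\right| = \lim_{n\to\infty}\left|a_{n}\right|^{1/n}$$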

Thanks

Last edited: Mar 16, 2009
2. Mar 16, 2009

### brush

So, for instance, how do I determine the radius of convergence of:

$$\sum^{\infty}_{n=0}\left(\frac{x^{n}}{n^{2}+1}\right)$$

Thanks again

Last edited: Mar 16, 2009
3. Mar 16, 2009

### yyat

Hi brush!

Both the ratio and the root test "work" in all cases (they are proven theorems), but which one is easier to use depends on the concrete form of the $a_n$. The ratio test is probably used more often than the root test.

In your particular example, try the ratio test.
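Sketching the computation (filling in the hint): with $a_{n} = x^{n}/(n^{2}+1)$, the ratio test gives

$$\left|\frac{a_{n+1}}{a_{n}}\right| = \left|\frac{x^{n+1}}{(n+1)^{2}+1}\cdot\frac{n^{2}+1}{x^{n}}\right| = \left|x\right|\cdot\frac{n^{2}+1}{(n+1)^{2}+1} \longrightarrow \left|x\right| \quad \text{as } n\to\infty,$$

so the series converges when $\left|x\right| < 1$.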

4. Mar 16, 2009

### brush

Thank you for the reply, yyat!

The confusion I was having was because I kept getting different results from the ratio and root test, but I have figured out what I was doing wrong.

In the case above, both the root and ratio tests give a limit of $\left|x\right|$, so (I think) the series converges for:

$$\left|x\right| < 1$$

5. Mar 16, 2009

### yyat

Yes, the radius of convergence is 1, that means the series converges for |x|<1. Note though that this tells you nothing about convergence at points with |x|=1, that needs to be checked separately.
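As a quick numerical illustration (a sketch I'm adding, not part of the thread; the helper name `coeff` is mine): the ratio and root of the coefficients $a_{n} = 1/(n^{2}+1)$ both approach 1, giving $R = 1$, and at the boundary $\left|x\right| = 1$ the series converges absolutely by comparison with $\sum 1/n^{2}$.

```python
# Numerical sanity check for the series sum_{n>=0} x^n / (n^2 + 1).
# Both the coefficient ratio and the n-th root tend to 1, so R = 1/1 = 1.

def coeff(n):
    """Coefficient a_n = 1/(n^2 + 1) of the power series (my helper name)."""
    return 1.0 / (n * n + 1)

n = 10_000
ratio = coeff(n + 1) / coeff(n)   # ratio test: a_{n+1} / a_n  -> 1
root = coeff(n) ** (1.0 / n)      # root test: |a_n|^(1/n)     -> 1
print(ratio, root)                # both close to 1

# At |x| = 1 the terms are 1/(n^2 + 1) in absolute value, so the series
# converges absolutely there (compare with sum 1/n^2); the partial sums
# settle to a finite value.
partial = sum(coeff(k) for k in range(100_000))
print(partial)
```

This only checks finitely many terms, of course; the limiting argument is the ratio-test computation above, but the numbers make the behavior easy to see.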