Root vs. ratio in determining radius of convergence of a power series

brush
Hi everyone :smile:

When determining the radius of convergence of a power series, when should I use the ratio test (based on \left|a_{n+1}/a_{n}\right|) versus the root test (based on \left|a_{n}\right|^{1/n})?

I know that I'm supposed to use the ratio test when there are factorials, but other than that, are the two tests basically interchangeable?

Also, are there any differences in usage/application of the tests in the context of determining the radius of convergence of a power series?

Thanks
 
So, for instance, how do I determine the radius of convergence of:

\sum^{\infty}_{n=0}\left(\frac{x^{n}}{n^{2}+1}\right)

Thanks again :smile:
 
Hi brush!

Both the ratio and the root test "work" in all cases (they are proven theorems), but which one is easier to use depends on the concrete form of the a_n. The ratio test is probably used more often than the root test.
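For reference, here is a sketch of the two standard formulas for the radius of convergence R of \sum a_{n}x^{n} (the root test is stated in its lim sup form, which applies even when the ordinary limit does not exist):

\frac{1}{R} = \limsup_{n\rightarrow\infty}\left|a_{n}\right|^{1/n} \qquad \text{(root test / Cauchy-Hadamard)}

\frac{1}{R} = \lim_{n\rightarrow\infty}\left|\frac{a_{n+1}}{a_{n}}\right| \qquad \text{(ratio test, whenever this limit exists)}

with the conventions 1/0 = \infty and 1/\infty = 0.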

In your particular example, try the ratio test. :wink:
 
Thank you for the reply, yyat! :smile:

The confusion I was having was because I kept getting different results from the ratio and root test, but I have figured out what I was doing wrong.

In the case above, both the root and ratio tests should give (I think) the condition

\left|x\right| < 1

so the radius of convergence is 1.
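For completeness, here is the ratio-test computation, writing c_{n} = \frac{x^{n}}{n^{2}+1} for the full term of the series:

\lim_{n\rightarrow\infty}\left|\frac{c_{n+1}}{c_{n}}\right| = \lim_{n\rightarrow\infty}\left|x\right|\cdot\frac{n^{2}+1}{(n+1)^{2}+1} = \left|x\right|

which is less than 1 exactly when \left|x\right| < 1.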
 
Yes, the radius of convergence is 1, which means the series converges for |x| < 1. Note, though, that this tells you nothing about convergence at points with |x| = 1; those need to be checked separately.
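For this particular series the endpoint check happens to be easy: when \left|x\right| = 1,

\left|\frac{x^{n}}{n^{2}+1}\right| = \frac{1}{n^{2}+1} \leq \frac{1}{n^{2}} \qquad (n \geq 1)

so the series also converges (absolutely) at every point with \left|x\right| = 1, by comparison with \sum 1/n^{2}.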
 