# QUESTION: Interval of Convergence for a series

1. May 10, 2006

### Hummingbird25

Hi

I have this series here

$$\sum_{n=1} ^{\infty} \frac{1}{x^2+n^2}$$

I need to show that the radius of convergence is $$R = \infty$$ and that the interval of convergence is therefore $$(- \infty, \infty)$$.

My question is: to do this, don't I use the ratio test?

Sincerely Yours

Hummingbird25

2. May 10, 2006

### eok20

a_n is the nth term, so:

$$\left| \frac{a_{n+1}}{a_n} \right | = \left| \frac{1}{x^2+(n+1)^2} \cdot \frac{x^2+n^2}{1} \right |$$

However, this goes to 1 as n goes to infinity, so the ratio test really doesn't help you here. What you probably want to do is use the comparison test with 1/n^2 to show that the series converges for any x.
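A quick numeric sketch of why the ratio test is inconclusive here (my own illustration, not from the thread): for a fixed x, the ratio a_{n+1}/a_n = (x^2 + n^2) / (x^2 + (n+1)^2) approaches 1 as n grows.

```python
def ratio(n, x=2.0):
    # a_n = 1 / (x^2 + n^2), so a_{n+1}/a_n simplifies to the quotient below.
    # The choice x = 2.0 is arbitrary; any fixed x gives the same limit.
    return (x**2 + n**2) / (x**2 + (n + 1)**2)

print(ratio(10))       # noticeably below 1 for small n
print(ratio(10**6))    # very close to 1 for large n
```

Since the limit of the ratio is exactly 1, the ratio test gives no information, which is why eok20 suggests the comparison test instead.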

3. May 10, 2006

### Hummingbird25

$$\sum_{n=1} ^{\infty} \frac{1}{x^2+n^2}$$

Then by the comparison test:

$$\frac{1}{x^2 + n^2} \leq \frac{1}{n^2}$$ ??

Sincerely

Hummingbird25

4. May 10, 2006

### eok20

That's right (with equality only when $$x = 0$$), and since $$\sum_{n=1} ^{\infty} \frac{1}{n^2}$$ converges (a p-series with p = 2 > 1), $$\sum_{n=1} ^{\infty} \frac{1}{x^2+n^2}$$ converges by comparison, since every term is no larger.
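The comparison argument above can be checked numerically (a sketch of my own, not from the thread): for any real x, the partial sums of 1/(x^2 + n^2) stay below the limit of the p-series, which is known to be pi^2/6.

```python
import math

def partial_sum(x, terms=100_000):
    # Partial sum of 1 / (x^2 + n^2) for n = 1 .. terms.
    return sum(1.0 / (x**2 + n**2) for n in range(1, terms + 1))

# Regardless of x, every partial sum is bounded by sum(1/n^2) = pi^2/6,
# so the series converges for every real x, i.e. on (-inf, inf).
for x in (0.0, 1.0, 10.0):
    s = partial_sum(x)
    print(f"x = {x}: partial sum = {s:.6f}  (bound pi^2/6 = {math.pi**2 / 6:.6f})")
```

Larger |x| makes every term smaller, so the sum decreases as |x| grows; the bound pi^2/6 is attained only in the limit at x = 0.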