# Convergence of Infinite Series

1. Sep 9, 2007

### Vagrant

1. The problem statement, all variables and given/known data
$$1+\frac{\alpha\beta}{\gamma}x+\frac{\alpha(\alpha+1)\,\beta(\beta+1)}{1\cdot 2\,\gamma(\gamma+1)}x^{2}+\cdots$$

2. Relevant equations

3. The attempt at a solution
Using D'Alembert's ratio test, I get $$\lim_{n\rightarrow\infty}\frac{U_{n+1}}{U_{n}}=x$$
so the series diverges for x > 1
and converges for x < 1.
When x = 1, using Raabe's test I get
$$\lim_{n\rightarrow\infty}n\left[\frac{U_{n}}{U_{n+1}}-1\right]=\gamma-\alpha-\beta$$
so the series converges if $$\gamma-\alpha-\beta>1$$
and diverges if $$\gamma-\alpha-\beta<1$$.
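The ratio-test limit can be sanity-checked numerically. Reading the coefficient off the series above as $$U_{n}=\frac{\alpha(\alpha+1)\cdots(\alpha+n-1)\,\beta(\beta+1)\cdots(\beta+n-1)}{n!\,\gamma(\gamma+1)\cdots(\gamma+n-1)}$$ (the $$1\cdot 2$$ in the $$x^{2}$$ term suggests an $$n!$$ in the denominator), the successive-coefficient ratio tends to 1, so the full term ratio tends to x. A small sketch, with sample parameters that are an arbitrary choice of mine:

```python
def coeff_ratio(n, a, b, c):
    # U_{n+1} / U_n for U_n = (a)_n (b)_n / (n! (c)_n):
    # appending one factor to each rising factorial and to the
    # factorial gives (a + n)(b + n) / ((n + 1)(c + n))
    return (a + n) * (b + n) / ((n + 1) * (c + n))

# sample parameters (arbitrary, purely for illustration)
a, b, c = 0.5, 1.5, 3.0
for n in (10, 1000, 100000):
    print(n, coeff_ratio(n, a, b, c))
# the printed ratios approach 1, so the ratio of successive
# terms U_{n+1} x^{n+1} / (U_n x^n) approaches x
```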

However the book has given the answer to be
converges if $$\gamma-\alpha-\beta$$>0
and diverges if $$\gamma-\alpha-\beta$$<0

Can anyone point out my mistake, please?

Last edited: Sep 9, 2007
3. Sep 9, 2007

### Avodyne

For Raabe's test with x = 1, I get
$$\lim_{n\rightarrow\infty}n\left(\frac{U_{n}}{U_{n+1}}-1\right)=\gamma-\alpha-\beta+1$$
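This value can be checked numerically. Taking the coefficient implied by the series, $$U_{n}=\frac{(\alpha)_{n}(\beta)_{n}}{n!\,(\gamma)_{n}}$$ with rising factorials (my reading of the term convention), the exact term ratio is $$\frac{U_{n}}{U_{n+1}}=\frac{(n+1)(\gamma+n)}{(\alpha+n)(\beta+n)}$$, and a quick sketch with sample parameters shows the Raabe limit:

```python
def raabe(n, a, b, c):
    # n * (U_n / U_{n+1} - 1), using the exact term ratio
    # U_n / U_{n+1} = (n + 1)(c + n) / ((a + n)(b + n))
    ratio = (n + 1) * (c + n) / ((a + n) * (b + n))
    return n * (ratio - 1)

# sample parameters (arbitrary choice): c + 1 - a - b = 2
a, b, c = 0.5, 1.5, 3.0
for n in (10, 1000, 100000):
    print(n, raabe(n, a, b, c))
# the values approach gamma - alpha - beta + 1 = 2.0
```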

4. Sep 9, 2007

### Vagrant

My expression for $$U_{n}$$ is $$U_{n}=\frac{\alpha(\alpha+1)\cdots(\alpha+n-1)\,\beta(\beta+1)\cdots(\beta+n-1)}{1\cdot 2\cdots(n-1)\,\gamma(\gamma+1)\cdots(\gamma+n-1)}$$

Using this I get $$\gamma-\alpha-\beta$$ from Raabe's test
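The gap between the two limits can be traced numerically: with the $$1\cdot 2\cdots(n-1)$$ denominator above, the Raabe expression tends to $$\gamma-\alpha-\beta$$, while a $$1\cdot 2\cdots n$$ denominator (which matches the $$1\cdot 2$$ in the $$x^{2}$$ term of the original series) shifts the limit by 1. A sketch of both conventions, with sample parameters that are my own choice:

```python
def raabe(n, a, b, c, fact_top):
    # n * (U_n / U_{n+1} - 1) where
    #   U_n = (a)_n (b)_n / ((1 * 2 * ... * fact_top(n)) * (c)_n)
    # fact_top(n) = n - 1 matches the (n-1)! in the post above;
    # fact_top(n) = n uses n! instead.
    # In both cases the exact term ratio simplifies to
    #   U_n / U_{n+1} = (fact_top(n) + 1)(c + n) / ((a + n)(b + n))
    ratio = (fact_top(n) + 1) * (c + n) / ((a + n) * (b + n))
    return n * (ratio - 1)

a, b, c = 0.5, 1.5, 3.0  # arbitrary sample parameters
n = 10**6
print(raabe(n, a, b, c, lambda m: m - 1))  # approaches c - a - b     = 1
print(raabe(n, a, b, c, lambda m: m))      # approaches c - a - b + 1 = 2
```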

Last edited: Sep 9, 2007