# Interval Of Convergence

1. Apr 9, 2013

### twoski

1. The problem statement, all variables and given/known data

Find the interval of convergence for the series $\sum \frac{(x-10)^{n}}{10^{n}}$

3. The attempt at a solution

If I use the ratio test on this, I end up with $(x-10)/10$, which doesn't make sense to me since there is no "n" in this result. Is there another method I need to be using?

2. Apr 9, 2013

### HallsofIvy

Staff Emeritus
There shouldn't be an "n"! The ratio test requires that you take the limit as n goes to infinity. The ratio test says that the series converges absolutely if the absolute value of that ratio is less than 1. So your condition is that $|x-10|/10 < 1$.
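For reference, the limit can be written out in full (assuming the series is $\sum_{n=0}^{\infty} (x-10)^{n}/10^{n}$; the starting index is not stated in the original post):

$$\lim_{n\to\infty}\left|\frac{a_{n+1}}{a_{n}}\right| = \lim_{n\to\infty}\left|\frac{(x-10)^{n+1}/10^{n+1}}{(x-10)^{n}/10^{n}}\right| = \frac{|x-10|}{10}$$

The $n$-dependence cancels entirely, which is why no "n" survives in the result; the ratio test then requires $|x-10|/10 < 1$.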

3. Apr 9, 2013

### Ray Vickson

Does the series converge/diverge for x = 5? For x = -5? For x = 7? For x = 21? Etc., etc.
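These spot checks can also be done numerically. A minimal sketch, assuming the sum starts at $n = 0$ and using an arbitrary 200-term partial sum (both are assumptions, not from the thread):

```python
def partial_sum(x, terms=200):
    """Partial sum of sum_{n=0}^{terms-1} ((x - 10) / 10) ** n."""
    r = (x - 10) / 10
    return sum(r ** n for n in range(terms))

for x in (5, -5, 7, 21):
    r = (x - 10) / 10
    inside = abs(r) < 1  # ratio test: converges when |x - 10| / 10 < 1
    s = partial_sum(x)
    note = f"~ {1 / (1 - r):.4f} (geometric limit)" if inside else "partial sums blow up"
    print(f"x = {x:3}: |r| = {abs(r):.2f}, converges = {inside}, partial sum = {s:.4g}, {note}")
```

Because the series is geometric with ratio $r = (x-10)/10$, the partial sums settle near $1/(1-r)$ exactly when $|r| < 1$, i.e. for x = 5 and x = 7, and grow without bound for x = -5 and x = 21.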

4. Apr 9, 2013

### Zondrina

After you've applied the absolute ratio test, the limit you get is $(1/10)|x-10|$.

Setting that limit less than 1 gives $|x-10| < 10$, so the radius of convergence is 10. Breaking down the absolute value, we can obtain the interval of convergence:

-10 < x - 10 < 10
0 < x < 20

You can check the endpoints at 0 and 20 to see if it converges or not.

5. Apr 9, 2013

### twoski

So if I plug in 0, the series diverges since its terms alternate between positive and negative indefinitely without shrinking.

If I plug in 20, the ratio is 1 since $10^{n}/10^{n}$ is 1, so the ratio test is inconclusive.

So I get (0, 20).

Last edited: Apr 9, 2013

6. Apr 9, 2013

### Staff: Mentor

I agree with your result, but not with your method. When you check the endpoints, you need to use something other than the ratio test, since you already know that the ratio test is inconclusive when the ratio (its absolute value) is 1.

At the endpoints (i.e., when x = 0 and x = 20) what are the actual series you get? You should have two series that contain only constants - no variables.
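For completeness, the two endpoint series (again assuming the sum starts at $n = 0$) are:

$$x = 0:\quad \sum_{n=0}^{\infty}\frac{(0-10)^{n}}{10^{n}} = \sum_{n=0}^{\infty}(-1)^{n}, \qquad x = 20:\quad \sum_{n=0}^{\infty}\frac{(20-10)^{n}}{10^{n}} = \sum_{n=0}^{\infty}1$$

In neither case do the terms tend to 0, so both diverge by the $n$th-term (divergence) test, confirming the open interval $(0, 20)$.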