Root or Ratio Test: Interval of Convergence

SUMMARY

The discussion focuses on determining the interval of convergence for the series Σ(-x/10)^(2k) using the root and ratio tests. The series can be expressed as Σ((x^2/100)^k), leading to the conclusion that the series converges if |x| < 10. Both the ratio test and the root test confirm this result, indicating divergence for |x| > 10. The final interval of convergence is established as (-10, 10).

PREREQUISITES
  • Understanding of geometric series
  • Familiarity with the ratio test
  • Knowledge of the root test
  • Basic algebraic manipulation of inequalities
NEXT STEPS
  • Study the properties of geometric series and their convergence criteria
  • Learn more about the ratio test and its applications in series convergence
  • Explore the root test in detail, including its derivation and examples
  • Investigate other convergence tests such as the comparison test and integral test
USEFUL FOR

Students and educators in mathematics, particularly those studying calculus and series convergence, as well as anyone seeking to deepen their understanding of convergence tests in infinite series.

Fernando Revilla
Gold Member
MHB
I quote a question from Yahoo! Answers

Σ(-x/10)^(2k) how do I find the interval of convergence using the root or ratio test?

I have given a link to the topic there so the OP can see my response.
 
We can express $\displaystyle\sum_{k=0}^{\infty}\left(\frac{-x}{10}\right)^{2k}=\sum_{k=0}^{\infty}\left(\frac{x^2}{100}\right)^{k}.$ Then,

$(a)$ Considering this series as a geometric series:
$$\left| \frac{x^2}{100} \right|<1\Leftrightarrow x^2<100\Leftrightarrow |x|<10$$
and the series is convergent iff $|x|<10.$
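The geometric-series argument above can be sanity-checked numerically: for $|x|<10$ the ratio is $r = x^2/100 < 1$, so the partial sums should approach the closed form $1/(1-r)$. A minimal sketch (the helper `partial_sum` is an illustrative name, not from the original post):

```python
def partial_sum(x, n_terms):
    """Sum of (x^2/100)^k for k = 0 .. n_terms - 1."""
    r = x**2 / 100
    return sum(r**k for k in range(n_terms))

x = 5.0                     # a point inside (-10, 10)
r = x**2 / 100              # ratio 0.25 < 1, so the series converges
closed_form = 1 / (1 - r)   # geometric series sum: 4/3

print(partial_sum(x, 60), closed_form)
```

For a point outside the interval, e.g. $x=11$, the ratio is $1.21 > 1$ and the partial sums grow without bound instead of settling.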

$(b)$ Using the ratio test:
$$\lim_{k\to \infty}\;\left| \left(\frac{x^2}{100}\right)^{k+1} \left(\frac{100}{x^2}\right)^{k} \right|=\frac{x^2}{100}<1\Leftrightarrow |x|<10$$
So, the series is convergent if $|x|<10$ and divergent if $|x|>10.$ At the endpoints $x=\pm 10$ we get $\displaystyle\sum_{k=0}^{\infty}1=1+1+\ldots$ (divergent).

$(c)$ Using the root test:
$$\lim_{k\to \infty}\;\left| \left(\frac{x^2}{100}\right)^{k} \right|^{1/k}=\frac{x^2}{100}<1\Leftrightarrow |x|<10$$
So, the series is convergent if $|x|<10$ and divergent if $|x|>10.$
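As a quick numerical illustration of both tests: with $a_k = (x^2/100)^k$, the ratio $a_{k+1}/a_k$ and the $k$-th root $a_k^{1/k}$ both equal $x^2/100$, the quantity compared against $1$. A sketch (the helper `term` is an illustrative name, not part of the original solution):

```python
def term(x, k):
    """k-th term a_k = (x^2/100)^k of the series."""
    return (x**2 / 100) ** k

x = 5.0
k = 50
ratio = term(x, k + 1) / term(x, k)  # ratio-test quantity
root = term(x, k) ** (1 / k)         # root-test quantity

# Both should agree with x^2/100 = 0.25, which is < 1, so the series
# converges at this x.
print(ratio, root, x**2 / 100)
```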

The interval of convergence is $(-10,10).$
 
