Alternating series (Leibniz criterion)

SUMMARY

The discussion centers on the convergence of alternating series, specifically the series \(\sum_{n=1}^{\infty} (-1)^n \left|\frac{1}{n^2} \sin(n)\right|\). It is established that the series converges despite the sequence \(a_n = \left|\frac{1}{n^2} \sin(n)\right|\) not being monotonically decreasing. The key takeaway is that the conditions of the Leibniz criterion are sufficient but not necessary: a series may converge even when its terms do not decrease monotonically in magnitude. In fact, this series converges absolutely by comparison with \(\frac{1}{n^2}\).

PREREQUISITES
  • Understanding of alternating series and the Leibniz criterion
  • Familiarity with convergence tests in calculus
  • Knowledge of absolute convergence and comparison tests
  • Basic trigonometric functions and their properties
NEXT STEPS
  • Study the proof of the Leibniz criterion for alternating series
  • Learn about absolute convergence and its implications
  • Explore the comparison test for series convergence
  • Investigate the behavior of trigonometric functions within series
USEFUL FOR

Students of calculus, mathematicians, and anyone studying series convergence, particularly those interested in alternating series and convergence criteria.

Damidami
I read that an alternating series \sum (-1)^n a_n converges if "and only if" the sequence a_n is both monotone decreasing and converges to zero.

I tried with this series:

\sum_{n=1}^{\infty} (-1)^n \left| \frac{1}{n^2} \sin(n) \right|

in Wolfram Alpha, and it seems to converge to -0.61..., even though a_n = \left|\frac{1}{n^2} \sin(n)\right| is not monotone decreasing.

What am I doing wrong? Is the monotone condition part of the hypothesis of this test, so that when a_n is not monotone the test simply says nothing about whether the series converges or not?

Thanks.
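A quick numerical check of the partial sums (a Python sketch, not from the thread; the cutoff values are arbitrary) reproduces what Wolfram Alpha reports, with the sums settling near -0.61:

```python
import math

# Partial sum S_N = sum_{n=1}^{N} (-1)^n * |sin(n) / n^2|.
def partial_sum(N):
    return sum((-1) ** n * abs(math.sin(n)) / n ** 2 for n in range(1, N + 1))

# The terms |sin(n)/n^2| are not monotone, yet the partial sums stabilize,
# because the tail beyond N is bounded by sum_{n>N} 1/n^2, which shrinks to 0.
for N in (10, 100, 1000, 10000):
    print(N, partial_sum(N))
```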
 
The conditions given are sufficient, not necessary. It is possible for the terms to not be decreasing in magnitude, and for the whole series to still converge.

For your particular series, you can prove it converges absolutely by comparison with 1/n^2.
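The comparison can also be checked numerically (a sketch using only the standard library): every partial sum of the absolute values is bounded by the matching partial sum of 1/n^2, which itself stays below \pi^2/6.

```python
import math

# Direct comparison: |(-1)^n * |sin(n)| / n^2| = |sin(n)| / n^2 <= 1 / n^2,
# and sum 1/n^2 converges (to pi^2/6), so the series converges absolutely.
N = 100_000
abs_partial = sum(abs(math.sin(n)) / n ** 2 for n in range(1, N + 1))
p_series_partial = sum(1.0 / n ** 2 for n in range(1, N + 1))

assert abs_partial <= p_series_partial < math.pi ** 2 / 6
```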
 
Office_Shredder said:
The conditions given are sufficient, not necessary. It is possible for the terms to not be decreasing in magnitude, and for the whole series to still converge.

For your particular series, you can prove it converges absolutely by comparison with 1/n^2.

You are right, thanks! I was confused because in class we saw this statement:

Let a_k > 0 be a monotone decreasing sequence. Then \sum_{k=1}^{\infty} (-1)^k a_k converges \Leftrightarrow a_k \to 0

The \Rightarrow part of the proof is simply the necessary condition for any series to converge: a_k \to 0.

But I thought the \Leftarrow part would imply everything besides it: both that the series converges and that a_k is monotone decreasing.

So it's a logic question now: when we have a double implication like the one above, the \Leftarrow implication only gives what the thesis states, while the hypothesis is assumed in both directions. Am I right now?

Thanks!
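Written out symbolically (an editorial restatement of the statement from class), the theorem has the form

\[
\underbrace{a_k > 0 \text{ and } (a_k) \text{ monotone decreasing}}_{H}
\;\Longrightarrow\;
\Big( \sum_{k=1}^{\infty} (-1)^k a_k \text{ converges} \;\Longleftrightarrow\; a_k \to 0 \Big).
\]

The hypothesis \(H\) is assumed on both sides of the biconditional; neither direction asserts \(H\) itself, so the \(\Leftarrow\) direction says nothing about a series whose terms fail to be monotone.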
 
