1. The problem statement, all variables and given/known data

If Ʃa_n converges, with a_n > 0, then Ʃ(a_n)^2 always converges.

2. Relevant equations

n/a

3. The attempt at a solution

I am at a complete loss. I have tried partial sums and the Cauchy criterion, and I tried the ratio test, which seems to work, but I am not sure about it.

Since Ʃa_n converges, by the ratio test

lim_{n→∞} a_{n+1} / a_n < 1.

Now apply the ratio test to Ʃ(a_n)^2:

lim_{n→∞} (a_{n+1})^2 / (a_n)^2 = (lim_{n→∞} a_{n+1} / a_n)^2 < 1^2 = 1.

Thus by the ratio test, Ʃ(a_n)^2 converges.

This working did not use the condition a_n > 0, so it seems suspect to me.
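One more thing that bothers me about the first step (this is my own check, not part of the problem): the ratio test is only a sufficient condition for convergence, so convergence of Ʃa_n need not force the ratio limit below 1. For example, with a_n = 1/n^2, a convergent p-series with a_n > 0,

\[
\frac{a_{n+1}}{a_n} = \frac{n^2}{(n+1)^2} \to 1 \quad (n \to \infty),
\qquad \text{yet } \sum_{n=1}^{\infty} \frac{1}{n^2} \text{ converges},
\]

so the limit can equal 1 (or fail to exist) for a convergent series, which makes me doubt the first step as well.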