1. The problem statement, all variables and given/known data
Assume that the series Σa_n converges and that a_n ≥ 0 for all n in N. Prove that the series Σa_n^2 converges.

2. Relevant equations

3. The attempt at a solution
Alright, this is what I've got so far:

Assume that Σa_n converges and that a_n ≥ 0 for all n in N. Since the series converges, the sequence (a_n) converges to 0. By the definition of convergence, for every ε > 0 there exists an N in N such that for all n ≥ N: |a_n| < ε, and furthermore: -ε < a_n < ε.

This is where I get stuck... am I allowed to just multiply through by a_n (which is ≥ 0, so the inequalities are preserved) to show that -εa_n ≤ a_n^2 ≤ εa_n for n ≥ N? And since Σa_n converges and ε is a constant, Σ(εa_n) also converges, so by the comparison test Σa_n^2 converges as well.

I don't know if what I'm doing is right; if it isn't, then any tips would be great!
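In case it helps to see the comparison step written out, here is a sketch of the same argument with the specific choice ε = 1 (that choice is my own simplification, not something from the problem statement):

```latex
% Since \sum a_n converges, a_n \to 0, so for \varepsilon = 1
% there is an N with 0 \le a_n < 1 for all n \ge N.
% Multiplying 0 \le a_n < 1 through by a_n \ge 0 gives:
\[
0 \le a_n^2 \le a_n \quad \text{for all } n \ge N.
\]
% The tail \sum_{n \ge N} a_n converges, so by the comparison test
% the tail \sum_{n \ge N} a_n^2 converges as well:
\[
\sum_{n \ge N} a_n < \infty \;\Longrightarrow\; \sum_{n \ge N} a_n^2 < \infty,
\]
% and adding the finitely many terms a_1^2, \dots, a_{N-1}^2
% does not affect convergence, so \sum a_n^2 converges.
```

This avoids carrying ε through the estimate, but the ε·a_n version in my attempt should work the same way since Σ(εa_n) = εΣa_n converges.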