1. The problem statement, all variables and given/known data

Determine the number of terms required to approximate the sum of the series with an error of less than 0.001:

Sum from n=1 to infinity of (-1)^(n+1) / n^3

2. Relevant equations

Alternating series remainder theorem: |S - S_n| = |R_n| <= a_(n+1)

3. The attempt at a solution

I guess this is what you do: the remainder bound gives a_(n+1) = 1/(n+1)^3 < 1/1000, so (n+1)^3 > 1000, which means n+1 > 10, i.e. n = 10, so 10 terms. (Note that n+1 = 10 is not enough, since 1/1000 = 0.001 is not strictly less than 0.001.)

But that doesn't quite make sense to me, and I'm not sure why. Could someone please explain this to me?
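As a sanity check on the reasoning above, here is a minimal numeric sketch (not part of the assignment, just an assumption-laden illustration): it approximates the true sum S with a very long partial sum, finds the smallest N for which the theorem's bound a_(N+1) = 1/(N+1)^3 drops below 0.001, and compares the actual error of S_N against that bound. The helper name `partial_sum` and the 100,000-term stand-in for S are my own choices.

```python
def partial_sum(N):
    """Sum of the first N terms of sum_{n=1}^inf (-1)^(n+1) / n^3."""
    return sum((-1) ** (n + 1) / n ** 3 for n in range(1, N + 1))

# Stand-in for the exact sum S: a very long partial sum
# (its own error is about 1/(100001)^3, far below 0.001).
S = partial_sum(100_000)

# Smallest N with the theorem's guarantee a_(N+1) = 1/(N+1)^3 < 0.001.
N = 1
while 1 / (N + 1) ** 3 >= 0.001:
    N += 1

print(N)                        # -> 10, since 1/11^3 ~ 0.000751 < 0.001
print(abs(S - partial_sum(N)))  # actual error, at most a_(N+1)
```

Running this confirms the cutoff: at N = 9 the bound is a_10 = 1/1000 = 0.001, which is not strictly less than 0.001, so N = 10 is the first value the theorem guarantees.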