freshman2013

I need to show that a_n is decreasing for all n. The way the book says to do it is to take the derivative of f(x), where f(x) = a_n with n replaced by x. However, if I know that the limit of |a_n| as n approaches infinity is 0, and that a_n (without the (-1)^n part) is positive for all n > N, shouldn't that be enough to prove that a_n is decreasing as n goes to infinity? All the examples I did seem to follow this reasoning. Example: (-1)^(n-3) * sqrt(n)/(n+4). Clearly, if n is a really big positive number, then sqrt(n)/(n+4) can't be negative, and its limit as n goes to infinity is zero. The only way I can see it approaching zero is by decreasing. Might there be exceptions to this, and if so, could you give an example? The only reason I'm asking is that taking the derivative seems like unnecessary work to me. If I explain this on a test instead of taking a derivative, might the professor have any reason to take points off?
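For what it's worth, here is a quick numerical sanity check of the example term sqrt(n)/(n+4) (this script is my own illustration, not something from the book): it verifies that the terms decrease from n = 4 onward and tend toward 0. The cutoff n = 4 comes from the derivative method the book suggests, since d/dx [sqrt(x)/(x+4)] = (4 - x) / (2*sqrt(x)*(x+4)^2), which is negative exactly when x > 4.

```python
import math

def b(n):
    """Terms of the series without the (-1)^(n-3) sign factor."""
    return math.sqrt(n) / (n + 4)

# Check monotone decrease numerically for n >= 4.
# The derivative of sqrt(x)/(x+4) is (4 - x) / (2*sqrt(x)*(x+4)^2),
# which is negative for x > 4, so the terms should decrease from n = 4 on.
assert all(b(n) > b(n + 1) for n in range(4, 10_000))

# The terms also shrink toward 0 for large n (b(n) ~ 1/sqrt(n)).
print(b(10**8) < 1e-3)
```

Of course, a finite check like this is only a sanity test, not a proof; it just confirms that for this particular example the terms really are eventually positive, decreasing, and going to 0.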