1. The problem statement, all variables and given/known data

Let f be bounded on the interval c < x < ∞, and let lim_{x→∞} g(x) = 0. Prove that lim_{x→∞} f(x)g(x) = 0. Does this follow directly from Theorem 19?

2. Relevant equations

Theorem 19: Assuming that f and g are each defined on a deleted neighborhood of x = b, and that lim_{x→b} f(x) = A and lim_{x→b} g(x) = B, then

lim_{x→b} (f(x) + g(x)) = A + B
lim_{x→b} f(x)g(x) = AB
lim_{x→b} f(x)/g(x) = A/B, provided B ≠ 0.

3. The attempt at a solution

It cannot follow directly from the theorem, because boundedness alone does not guarantee that lim_{x→∞} f(x) exists; for example, f(x) = sin(x) is bounded but has no limit as x → ∞. So, do you have to prove that lim_{x→∞} f(x) exists, and then it would follow from the theorem that lim_{x→∞} f(x)g(x) = A·0 = 0?
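For reference, here is a sketch of the standard direct estimate that avoids Theorem 19 entirely (it assumes only a bound M on |f|, which is given, not that lim f exists):

```latex
Since $f$ is bounded on $c < x < \infty$, there is an $M > 0$ with
$|f(x)| \le M$ for all $x > c$.

Given $\varepsilon > 0$, since $\lim_{x\to\infty} g(x) = 0$ we may choose
$N \ge c$ so that $x > N$ implies $|g(x)| < \varepsilon / M$.

Then for all $x > N$,
\[
  |f(x)g(x) - 0| = |f(x)|\,|g(x)| \le M \cdot \frac{\varepsilon}{M} = \varepsilon,
\]
which is exactly the definition of $\lim_{x\to\infty} f(x)g(x) = 0$.
```

This is why the hypothesis "f bounded" suffices even when lim f(x) does not exist.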