1. The problem statement, all variables and given/known data

Water is pumped out of a lake at the rate R(t) = 12√(t/(t+1)) cubic meters per minute, where t is measured in minutes. How much water is pumped from time t = 0 to t = 5?

2. Relevant equations

N/A

3. The attempt at a solution

I cannot figure out whether evaluating the integral of R(t) from t = 0 to t = 5 is all that is required, or whether I also need to convert the cubic meters to other units before completing the problem.
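Not part of the original post, but one way to sanity-check whether integrating R(t) from 0 to 5 is the whole answer is to evaluate ∫₀⁵ R(t) dt numerically. The sketch below uses a composite Simpson's rule; the function names and step count are my own choices, not anything from the problem:

```python
import math

def R(t):
    # Pumping rate from the problem: 12 * sqrt(t / (t + 1)) cubic meters per minute
    return 12 * math.sqrt(t / (t + 1))

def simpson(f, a, b, n=1000):
    # Composite Simpson's rule on [a, b]; n must be even
    h = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3

total = simpson(R, 0, 5)  # total volume pumped, in cubic meters
```

Since R(t) is already in cubic meters per minute and t is in minutes, the integral's units come out directly in cubic meters, so no unit conversion is needed. For comparison, an antiderivative of √(t/(t+1)) is √(t(t+1)) − ln(√t + √(t+1)), which gives the exact value 12(√30 − ln(√5 + √6)) ≈ 47.19 m³.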