1. The problem statement, all variables and given/known data

Queuing theory (the study of waiting lines, e.g. at stores) says that for a drive-through window at a McDonald's, the function f(x) = 9/(x(x − 9)) gives the average time in hours a customer will wait in line, where x is the average number of customers per hour. How long will a customer have to wait in line, on average?

3. The attempt at a solution

I'm not sure at all how to find an average time from this function.
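To see what the formula does, here is a minimal sketch that just evaluates f(x) at a sample arrival rate. The value x = 12 is made up for illustration; the problem doesn't specify one, and note the model only gives a positive wait when x > 9.

```python
def wait_time_hours(x):
    """Average wait in hours from the given model f(x) = 9 / (x(x - 9)).

    Only meaningful for x > 9; at x = 9 or x = 0 the denominator is zero,
    and below 9 the formula returns a negative (unphysical) wait.
    """
    return 9 / (x * (x - 9))

# Sample value x = 12 customers/hour (an assumption, not from the problem):
hours = wait_time_hours(12)   # 9 / (12 * 3) = 0.25 hours
minutes = hours * 60          # 15 minutes
print(hours, minutes)
```

So with a hypothetical rate of 12 customers per hour, the model predicts a quarter-hour wait; the actual answer depends on the value of x the problem intends.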