sjnt
Homework Statement
A car is traveling on a straight road on a stretch that contains cities A, B, C and D.
The distance from city A to city D is 60 miles and the cities are evenly placed along the route. There are cell phone towers in each city. Each tower has a range of 10 miles in all directions.
Suppose that the velocity of the car is given by
r(t) = 60t for t ≤ 1
r(t) = 120 - 60t for t ≥ 1
where t is measured in hours and r(t) in mph.
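Since each part below asks when the car reaches a given mileage, it can help to integrate the velocity once and work with a position function. A minimal sketch in Python (the function name and the closed forms are my own, obtained by integrating each piece of r(t)):

```python
def position(t):
    """Miles traveled since t = 0, assuming the piecewise velocity above.

    For t <= 1: integral of 60t from 0 to t gives 30t^2.
    For t >= 1: the 30 miles covered in the first hour, plus the
    integral of 120 - 60t from 1 to t.
    """
    if t <= 1:
        return 30 * t**2
    return 30 + (120 * t - 30 * t**2) - (120 - 30)
```

For example, position(1) gives 30.0, so after one hour the car is at mile 30, between cities B and C.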
Homework Equations
1. Suppose that at t = 0 the car is at City A. How long does it take the car to make the trip?
2. For how long is the car within range of the tower in City A? In City B?
3. Do you think it is possible to start driving from city A, stop driving at city D, and maintain the same percentage of time within range of each location? If so, how?
The Attempt at a Solution
My problem isn't integrating but finding the equations to set up for integration.
1. This is what I set up to integrate. Is this right?
60 = ∫(120 - 60t) dt from t = 0 to t = x
60 = (120x - 30x²) - 0
30x² - 120x + 60 = 0
After using the quadratic formula I got,
x=2+√2 hours
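One way to sanity-check a candidate answer is to integrate the full piecewise velocity numerically and compare the accumulated distance with the 60-mile trip length; note that the integral above uses only the t ≥ 1 branch, so the leg driven during 0 ≤ t ≤ 1 isn't included in it. A rough sketch of such a check (my own; midpoint Riemann sum, step count arbitrary):

```python
def r(t):
    # piecewise velocity from the problem statement (mph)
    return 60 * t if t <= 1 else 120 - 60 * t

def distance(x, n=200_000):
    # midpoint Riemann sum of r over [0, x] (hours) -> miles traveled
    h = x / n
    return sum(r((k + 0.5) * h) for k in range(n)) * h
```

Comparing distance(x) with 60 for a candidate x is then a quick check; it's also worth noticing that r(t) is negative for t > 2, so whether times past t = 2 make physical sense for this trip is something to think about.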
2. This is the integral I set up for City A, setting the distance equal to 10 miles (the edge of A's range):
10 = ∫60t dt from t = 0 to t = x
After some integration and simplification I got
x = 1/√3 hours within range of City A.
Would I do the same for city B?
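For City B and the other towers, one possible framing (my own sketch, not necessarily the intended route): each tower covers a window of road, so City A at mile 0 covers miles [0, 10] and City B at mile 20 covers [10, 30], and the question becomes how long the car's mileage stays inside that window. Here I sample time on a grid using a position function obtained by integrating each piece of r(t); t_end = 2 is my assumption about when the trip ends and should be checked against part 1:

```python
def position(t):
    # miles traveled, from integrating the piecewise velocity
    return 30 * t**2 if t <= 1 else 120 * t - 30 * t**2 - 60

def time_in_range(center, radius=10, t_end=2.0, n=200_000):
    # hours during [0, t_end] that the car spends within
    # `radius` miles of the tower at mile `center`
    h = t_end / n
    return sum(h for k in range(n)
               if abs(position((k + 0.5) * h) - center) <= radius)
```

Under these assumptions, time_in_range(0) agrees with the 1/√3 ≈ 0.577 h found above for City A, and the same call with center 20, 40, or 60 covers the other towers.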
3. I suppose you could tell by setting up a graph. But what do I graph? And what am I supposed to look for?
Thanks!