A delivery company wants to speed up its deliveries by dropping packages into moving trucks. A worker positioned on an overpass directly above a straight, level road drops a package into a truck at the correct time. One day, a delivery truck starts from rest and drives along the road with a constant acceleration of (1/2)g. A package is released at the correct instant to land in the truck. If the overpass is 30 m above the truck and the truck started from a position 100 m from the point of impact, how long after the truck started did the worker wait before dropping the package? (Answer: 3.9 s)
The Attempt at a Solution
The givens I pulled out of this problem are:
Constant acceleration of the truck = (1/2)g = (1/2)(9.8 m/s^2) = 4.9 m/s^2
Drop height of the package = 30 m
Initial distance of the truck from the impact point = 100 m
Time = ?
I don't know which equations to use to solve this. I'm assuming we're dealing with two separate motions at once (the falling package and the accelerating truck), but I don't know how to combine them.
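Not an official solution, but one way to sanity-check the stated answer is to treat the two motions separately with the constant-acceleration equations: the package falls freely through 30 m, so its fall time comes from h = (1/2)g t^2; the truck covers 100 m from rest at 4.9 m/s^2, so its travel time comes from d = (1/2)a t^2. The worker's wait is the difference between those two times. A minimal Python sketch (variable names are my own):

```python
import math

g = 9.8      # gravitational acceleration, m/s^2
a = 0.5 * g  # truck's constant acceleration, m/s^2
h = 30.0     # drop height of the package, m
d = 100.0    # truck's starting distance from the impact point, m

# Package: free fall from rest, h = (1/2) g t^2  ->  t = sqrt(2h/g)
t_fall = math.sqrt(2 * h / g)

# Truck: starts from rest, d = (1/2) a t^2  ->  t = sqrt(2d/a)
t_truck = math.sqrt(2 * d / a)

# The package must be released t_fall seconds before impact,
# so the worker waits (t_truck - t_fall) after the truck starts.
t_wait = t_truck - t_fall
print(round(t_wait, 1))  # 3.9
```

This gives t_fall ≈ 2.47 s and t_truck ≈ 6.39 s, so the wait is about 3.9 s, matching the given answer.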