1. The problem statement, all variables and given/known data

A plane flying at an altitude of 900 m has to deliver a package to a ship. The plane travels at 180 m/s due north, directed 15° below the horizontal. The ship's velocity is 40 m/s, and it is also travelling due north. At what horizontal distance from the ship must the package be dropped for it to land on the ship?

2. Relevant equations

I know to use the equations of motion, but I don't understand the problem.

3. The attempt at a solution

Is the height of the plane changing? The ship and the plane each only have a velocity in one direction, whereas the package, once released, has velocity in two (x and y), right? How do I even begin to set this problem up?
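One way to set this up, sketched numerically below (a sketch only, assuming g = 9.8 m/s² and that the package inherits the plane's full velocity at release, so it starts with both a horizontal and a downward component from the 15° dive):

```python
import math

g = 9.8                    # gravitational acceleration, m/s^2 (assumed)
h = 900.0                  # release altitude, m
v_plane = 180.0            # plane speed, m/s
angle = math.radians(15)   # dive angle below the horizontal
v_ship = 40.0              # ship speed, m/s, same direction as the plane

# The package inherits the plane's velocity at release.
vx = v_plane * math.cos(angle)   # horizontal component, northward
vy = v_plane * math.sin(angle)   # vertical component, taken positive downward

# Vertical motion (positive down): h = vy*t + (1/2)*g*t^2, solved for t
t = (-vy + math.sqrt(vy**2 + 2 * g * h)) / g

# Horizontal distances covered during the fall
x_package = vx * t               # package's horizontal travel from release
x_ship = v_ship * t              # ship's travel over the same time

# The ship must be this far ahead of the release point at the drop
lead_distance = x_package - x_ship

print(f"fall time      = {t:.2f} s")
print(f"lead distance  = {lead_distance:.1f} m")
```

The key idea is that the vertical and horizontal motions are independent: the quadratic in the vertical direction fixes the fall time, and the horizontal gap is just the difference in horizontal travel between package and ship over that time.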