1. The problem statement, all variables and given/known data

A package is to be dropped from an airplane so that it hits the ground at a designated spot near some campers. The airplane, moving horizontally at a constant velocity of 140 km/h, approaches the spot at an altitude of 0.500 km above level ground. Having the designated point in sight, the pilot prepares to drop the package. What should the angle be between the horizontal and the pilot's line of sight when the package is released?

2. Relevant equations

d = (1/2)at^2
v = at

3. The attempt at a solution

Once the package is dropped, its velocity components are:

vx = 140 km/h = 38.9 m/s
vy = -gt

The displacement components, where t is the time since release:

dx = (38.9 m/s)t
dy = 500 m - (1/2)gt^2

Setting dy = 0 and solving for the time it takes the package to hit the ground gives t = sqrt(2(500 m)/g) ≈ 10.1 s. (Leaving the altitude as 0.500 instead of converting to 500 m gives 0.32 s, which is off by a factor of sqrt(1000); all quantities have to be in consistent units first.) In that time the package travels a horizontal distance of (38.9 m/s)(10.1 s) ≈ 393 m.

Where do I go from here in determining the angle? I think I'm missing something obvious here.
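As a sanity check on the numbers, here is a minimal Python sketch of the whole calculation (variable names are my own). It assumes the sight angle is measured from the horizontal at the moment of release, so that the plane, the target, and the horizontal form a right triangle with tan(theta) = altitude / horizontal distance covered during the fall:

```python
import math

# Given values, converted to SI units
v_x = 140 / 3.6    # horizontal speed: 140 km/h -> 38.9 m/s
h = 0.500 * 1000   # altitude: 0.500 km -> 500 m
g = 9.81           # gravitational acceleration, m/s^2

# Time to fall: 0 = h - (1/2) g t^2  ->  t = sqrt(2h/g)
t = math.sqrt(2 * h / g)

# Horizontal distance covered during the fall
d_x = v_x * t

# At release, the pilot's line of sight runs from the plane down to the
# target, so tan(theta) = h / d_x, with theta measured from the horizontal
theta = math.degrees(math.atan2(h, d_x))

print(f"fall time           t = {t:.1f} s")      # ~10.1 s
print(f"horizontal distance d = {d_x:.0f} m")    # ~393 m
print(f"sight angle     theta = {theta:.1f} deg")  # ~51.8 deg
```

In other words, the numbers already computed are all that's needed: the drop height and the horizontal distance are the two legs of the sight triangle, and the angle follows from the inverse tangent of their ratio.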