1. The problem statement, all variables and given/known data

An airplane flies horizontally with a constant speed of 260 mph at an altitude of 500 m. Ignore the height of this point above sea level. Assume the acceleration due to gravity is g = 9.8 m/s^2.

After a package is ejected from the plane, how long will it take to reach sea level from the time it is ejected? Assume the package has an initial velocity of 260 mph in the horizontal direction. What is the speed of the package when it hits the ground (in mph)?

2. Relevant equations

x = u·t + (1/2)·a·t^2
speed = sqrt(vx^2 + vy^2) (I don't know how to incorporate that into finding the speed, though)
vy = v0y + ay·t

3. The attempt at a solution

I've already found that it will take 10.1 s for the package to hit the ground, and that the horizontal distance from the target at which it should be released is 1170 m, but I'm not sure about finding vy. Please help. Thanks in advance.
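As a sanity check on the numbers in the attempt, here is a minimal Python sketch of the two kinematics steps already worked out (drop time from h = (1/2)·g·t^2 and horizontal range at constant speed); the variable names are mine, and the conversion 1 mph = 0.44704 m/s is standard:

```python
import math

g = 9.8              # m/s^2, given
h = 500.0            # m, drop altitude, given
v0 = 260 * 0.44704   # 260 mph converted to m/s (1 mph = 0.44704 m/s)

# Vertical motion from rest: h = (1/2) * g * t^2  =>  t = sqrt(2h / g)
t = math.sqrt(2 * h / g)

# Horizontal motion at constant speed: x = v0 * t
x = v0 * t

print(round(t, 1))   # drop time in s (should match the ~10.1 s found above)
print(round(x))      # horizontal distance in m (should match the ~1170 m found above)
```

Running this reproduces the 10.1 s drop time and a horizontal distance of about 1174 m, consistent with the attempt.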