I've been trying this question, but I doubt my first answer...

An airplane with a speed of 97.5 m/s is climbing upward at an angle of 50.0 degrees with respect to the horizontal. When the plane's altitude is 732 m, the pilot releases a package. (a) Calculate the distance along the ground, measured from a point directly beneath the point of release, to where the package hits the ground. (b) Relative to the ground, determine the angle of the velocity vector of the package just before impact.

For (a): I started by using the equation Y = (Vf^2 - Vi^2)/(2a) and plugged in the given values: (97.5)^2 / (2 * 9.8) = 485 m. Is this correct? How do I start part (b)? I appreciate the help.
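Since the package leaves the plane with the plane's velocity (97.5 m/s at 50.0 degrees above the horizontal), one way to sanity-check an answer is to run the projectile-motion kinematics numerically. Below is a minimal Python sketch, assuming g = 9.8 m/s^2 and no air resistance (these assumptions, and the variable names, are mine, not from the problem statement):

```python
import math

v0 = 97.5                    # release speed (m/s)
theta = math.radians(50.0)   # climb angle above the horizontal
h = 732.0                    # release altitude (m)
g = 9.8                      # gravitational acceleration (m/s^2)

# Velocity components at release (the package inherits the plane's velocity)
vx = v0 * math.cos(theta)
vy0 = v0 * math.sin(theta)

# Time of flight: solve h + vy0*t - 0.5*g*t^2 = 0 for the positive root
t = (vy0 + math.sqrt(vy0**2 + 2 * g * h)) / g

# (a) Horizontal distance covered before impact
x = vx * t

# (b) Velocity components just before impact; angle measured below horizontal
vy = vy0 - g * t
angle = math.degrees(math.atan2(abs(vy), vx))

print(f"time of flight: {t:.1f} s")
print(f"horizontal distance: {x:.0f} m")
print(f"impact angle: {angle:.1f} deg below horizontal")
```

Note that this treats the fall as full 2-D projectile motion with a nonzero upward initial vertical velocity, rather than applying Vf^2 - Vi^2 = 2aY with the full speed as if the motion were purely vertical; the two approaches give different distances.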