# Projectile motion airplane speed problem

An airplane with a speed of 97.5 m/s is climbing upward at an angle of 50.0 degrees with respect to the horizontal. When the plane's altitude is 732 m, the pilot releases a package. (a) Calculate the distance along the ground, measured from a point directly beneath the point of release, to where the package hits the earth. (b) Relative to the ground, determine the angle of the velocity vector of the package just before impact.

I started this problem by figuring out the velocity components for both the horizontal motion and the vertical motion of the package.
initial x-velocity = (initial speed) * cos(50.0°) = (97.5 m/s) * cos(50.0°) = 62.7 m/s
initial y-velocity = (initial speed) * sin(50.0°) = (97.5 m/s) * sin(50.0°) = 74.7 m/s
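To make sure I didn't fat-finger the trig, here's the quick check I ran on those components (just a sketch; the 97.5 m/s and 50.0° are the given values, and the package leaves the plane with the plane's velocity):

```python
import math

# Check of the launch-velocity components: 97.5 m/s at 50.0 degrees above horizontal.
v0 = 97.5                      # launch speed, m/s
angle = math.radians(50.0)     # launch angle above horizontal

v0x = v0 * math.cos(angle)     # horizontal component
v0y = v0 * math.sin(angle)     # vertical component
print(f"v0x = {v0x:.1f} m/s, v0y = {v0y:.1f} m/s")   # expect ~62.7 and ~74.7
```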
Then I thought I should find the time it takes for the package to hit the ground, using the equation y = (initial y-velocity)*t + (1/2)*(y-acceleration)*t^2:
-732 m = (74.7 m/s)*t + (1/2)*(-9.80 m/s^2)*t^2, which gives t ≈ 22 s.
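Here's the little script I used to sanity-check that time of flight (a sketch only; it assumes upward is positive, g = 9.80 m/s^2, and that the positive root of the quadratic is the physical one):

```python
import math

# Sanity check of the time-of-flight calculation for the dropped package.
v0 = 97.5                    # launch speed of the package, m/s
angle = math.radians(50.0)   # launch angle above horizontal
y0 = 732.0                   # release altitude, m
g = 9.80                     # magnitude of gravitational acceleration, m/s^2

v0y = v0 * math.sin(angle)   # initial vertical velocity component

# Solve -y0 = v0y*t - (1/2)*g*t^2, i.e. (1/2)*g*t^2 - v0y*t - y0 = 0,
# and keep the positive root (the negative root is unphysical here).
t = (v0y + math.sqrt(v0y**2 + 2.0 * g * y0)) / g
print(f"time of flight t = {t:.1f} s")   # expect ~22 s
```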
My question: Is this the direction I should be following to solve this problem, or am I totally hosed up on this? I don't want the answer to the overall question, just a hint to help me figure this out.
P.S. Sorry about the t^2 notation; I could not figure out how to do superscripts or subscripts here.