## Projectile Motion

1. The problem statement, all variables and given/known data
An airplane flies horizontally with a constant speed of 260 mph at an altitude of 500 m. Ignore the height of this point above sea level. Assume the acceleration due to gravity is g = 9.8 m/s^2.

After a package is ejected from the plane, how long will it take to reach sea level from the time it is ejected? Assume the package has an initial velocity of 260 mph in the horizontal direction. What is the speed of the package when it hits the ground (in mph)?

2. Relevant equations

x = u t + (1/2) a t^2
v = sqrt(vx^2 + vy^2) (I don't know how to incorporate that into finding the speed, though)
v_y = v_0y + a_y t

3. The attempt at a solution

I've already found that it will take 10.1 s for the package to hit the ground, and that the horizontal distance from which it should be released is 1170 m... but I'm not sure about finding vy. Please help. Thanks in advance.
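As a sanity check, the two numbers already found (10.1 s and roughly 1170 m) can be reproduced with a short script. This is just a sketch of the arithmetic, assuming the conversion 1 mph = 0.44704 m/s:

```python
import math

g = 9.8                   # m/s^2, given
h = 500.0                 # m, altitude above sea level
v0x = 260 * 0.44704       # 260 mph converted to m/s (about 116.2 m/s)

# Vertical drop with zero initial vertical velocity: h = (1/2) g t^2
t = math.sqrt(2 * h / g)  # time to reach sea level
x = v0x * t               # horizontal distance covered during the fall

print(f"t = {t:.1f} s, x = {x:.0f} m")
```

This gives t ≈ 10.1 s and x ≈ 1174 m, which rounds to the 1170 m quoted above.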
Admin: One has to find the horizontal velocity vx and the vertical velocity vy, and add the two vectors: v = $$\sqrt{v_x^2\,+\,v_y^2}$$
 But I don't know vy!

Homework Helper: It appears that v_y was initially zero, since there was only the horizontal velocity of 260 mph. So simply v_y = u_y + a t, and the speed you are after is the magnitude of your velocity vector (combine the x and y components).