Projectile Motion

  1. 1. The problem statement, all variables and given/known data
    An airplane flies horizontally at a constant speed of 260 mph at an altitude of 500 m. Ignore the height of this point above sea level. Assume the acceleration due to gravity is g = 9.8 m/s^2.

    After a package is ejected from the plane, how long will it take to reach sea level from the time it is ejected? Assume the package has an initial velocity of 260 mph in the horizontal direction. What is the speed of the package when it hits the ground (in mph)?

    2. Relevant equations

    x = u t + (1/2) a t^2
    v = sqrt(vx^2 + vy^2) (I don't know how to incorporate that into finding the speed, though)

    3. The attempt at a solution

    I've already found that it will take 10.1 s for the package to hit the ground, and that the horizontal distance from the release point to where it lands is 1170 m... but I'm not sure how to find vy. Please help. Thanks in advance.
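A quick sanity check of those two numbers (this is just a sketch using the values given in the problem; the 0.44704 mph-to-m/s conversion factor is standard but not stated in the thread):

```python
import math

G = 9.8              # m/s^2, acceleration due to gravity (as given)
H = 500.0            # m, drop altitude (as given)
MPH_TO_MS = 0.44704  # conversion factor: 1 mph in m/s

vx = 260 * MPH_TO_MS          # horizontal speed, ~116.2 m/s

# Vertical drop from rest: H = (1/2) g t^2, so t = sqrt(2H/g)
t = math.sqrt(2 * H / G)      # time to fall 500 m, ~10.1 s
x = vx * t                    # horizontal distance covered, ~1174 m

print(round(t, 1), round(x))
```

This agrees with the 10.1 s and (to three significant figures) the 1170 m quoted above.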
  3. Astronuc

    Staff: Mentor

    One has to find the horizontal velocity vx and the vertical velocity vy, and add the two vectors.

    v = [tex]\sqrt{v_x^2\,+\,v_y^2}[/tex]
  4. But I don't know vy!
  5. mjsd

    Homework Helper

    It appears that v_y was initially zero, since there was only a horizontal velocity of 260 mph. So simply v_y = u_y + a t, and the speed you are after is the magnitude of your velocity vector (you need to combine the x and y components).
  6. Astronuc

    Staff: Mentor

    As mjsd indicated, assume that the package is dropped with zero vertical velocity, then accelerates downward under the influence of gravity.

    One was able to find the time at which the package struck the ground. One should be able to find the vertical velocity after falling 500 m with an acceleration of g.
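Putting the two suggestions together, a minimal sketch of the final-speed calculation (again assuming the standard mph-to-m/s factor, which is not part of the thread):

```python
import math

G = 9.8              # m/s^2, as given in the problem
H = 500.0            # m, drop altitude
MPH_TO_MS = 0.44704  # conversion factor: 1 mph in m/s

t = math.sqrt(2 * H / G)      # fall time found earlier, ~10.1 s
vx = 260 * MPH_TO_MS          # horizontal component, unchanged in flight
vy = G * t                    # v_y = u_y + g t with u_y = 0, ~99 m/s

# Combine the components: |v| = sqrt(vx^2 + vy^2)
speed = math.hypot(vx, vy)
print(round(speed / MPH_TO_MS))   # back to mph, ~342 mph
```

Equivalently, v_y = sqrt(2 g H) = sqrt(9800) ≈ 99 m/s gives the same result without using t.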