1. The problem statement, all variables and given/known data

An airplane with a speed of 45.2 m/s is climbing upward at an angle of 40° with respect to the horizontal. When the plane's altitude is 540 m, the pilot releases a package.

(a) Calculate the distance along the ground, measured from a point directly beneath the point of release, to where the package hits the earth.

(b) Relative to the ground, determine the angle of the package's velocity vector just before impact (measured clockwise from the positive x axis).

We know the x and y components of the initial velocity, the height at which the plane drops the package, and the angle at which the plane is ascending with respect to the horizontal.

2. Relevant equations

Delta Y = Viy*t + (1/2)a*t^2

V = Vi + at

X = Vx*t

3. The attempt at a solution

I found the time it would take for the package to hit the ground if it were simply dropped from rest: Sqrt(2Y/g) = Sqrt(2*540/9.8) = 10.5 seconds. But the package has an initial velocity in the Y direction, so I thought to divide the Y velocity component (45.2 sin 40°) by the acceleration to find the additional time the flight would take: right after the plane releases the package, the package travels a certain distance upward because of that velocity, and only then starts moving downward. I understand that concept, but after that I have no idea where to go from here.
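For anyone who wants to check their numbers: rather than splitting the flight into an "up" leg and a "down" leg, you can solve the full kinematic equation y(t) = h + Viy*t - (1/2)g*t^2 = 0 in one step via the quadratic formula. Below is a minimal Python sketch of that approach (assuming g = 9.8 m/s^2 and the up-positive sign convention); the variable names are my own, not from the problem.

```python
import math

# Given data from the problem statement
v0 = 45.2                    # launch speed, m/s
theta = math.radians(40.0)   # climb angle above the horizontal
h = 540.0                    # release altitude, m
g = 9.8                      # gravitational acceleration, m/s^2 (assumed)

vx = v0 * math.cos(theta)    # horizontal velocity (constant, no air resistance)
vy0 = v0 * math.sin(theta)   # initial vertical velocity (upward, positive)

# Solve h + vy0*t - (1/2)*g*t^2 = 0 for the positive root
t = (vy0 + math.sqrt(vy0**2 + 2 * g * h)) / g

x = vx * t                   # horizontal distance traveled (part a)

vyf = vy0 - g * t            # vertical velocity at impact (negative = downward)
angle = math.degrees(math.atan2(abs(vyf), vx))  # angle below horizontal (part b)

print(f"time of flight: {t:.2f} s")     # about 13.87 s
print(f"range: {x:.0f} m")              # about 480 m
print(f"impact angle: {angle:.1f} deg below horizontal")  # about 72 deg
```

Note that the single quadratic already accounts for the upward coast after release, which is exactly the piece that is awkward to bolt on by hand.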

PLEASE HELP!! My College Physics midterm is due by Sunday night. This is one of the practice problems, and I don't understand this concept; I just KNOW my teacher will put one of these on the midterm. Thank you everyone :)

**Physics Forums - The Fusion of Science and Community**

# Projectile Motion problem with Plane dropping package
