
## Homework Statement

An airplane with a speed of 17.6 m/s is climbing upward at an angle of 42° counterclockwise from the positive x axis. When the plane's altitude is 840 m the pilot releases a package.

Calculate the distance along the ground, measured from a point directly beneath the point of release, to where the package hits the earth.

Known:

v (initial) = 17.6 m/s

direction: 42° above the positive x axis

vertical displacement = -840 m

a in the y direction: -9.8 m/s^2

a in the x direction: 0 m/s^2

Unknown:

v initial in the y and x directions

time

distance (horizontal)

## Homework Equations

d = v(initial)*t + (1/2)*a*t^2 in the y direction

and

d = (avg velocity)*(time) in the x direction

## The Attempt at a Solution

The question asks for the horizontal distance from the point directly beneath the release to where the package hits the ground.

First I calculated the time it would take for the package to reach the ground. I took the initial velocity in the y direction to be 0, since that is the y velocity at the maximum height of the projectile (for the package, 840 m above the ground). Plugging into d = v(initial)*t + (1/2)*a*t^2:

-840 = 0*t + (1/2)(-9.8)*t^2, which gives t = 13.1 s.

Then I found the initial velocity in the x direction using trigonometry: 13.1 m/s = 17.6 m/s * cos 42°.

Finally, plugging into d = v*t, I get 13.1 m/s * 13.1 s ≈ 171 m.

Apparently this is the wrong answer. Can anyone help me with this? Thanks!
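For what it's worth, the arithmetic in my attempt can be reproduced with a short Python sketch (variable names are my own; it uses the same assumption I made above, that the initial y velocity is zero at release):

```python
import math

# Reproduces the arithmetic of the attempt above, under the
# assumption that the vertical velocity at release is zero.
g = 9.8            # m/s^2, magnitude of gravitational acceleration
h = 840.0          # m, altitude at release
v0 = 17.6          # m/s, plane's speed
angle = math.radians(42)  # climb angle above the positive x axis

# Fall time from -840 = 0*t - (1/2)*g*t^2  (v_y0 assumed 0)
t = math.sqrt(2 * h / g)

# Horizontal speed component from trigonometry
vx = v0 * math.cos(angle)

# Horizontal distance under these assumptions
d = vx * t
print(round(t, 1), round(vx, 1), round(d))  # 13.1 13.1 171
```

This confirms the numbers I got by hand (t ≈ 13.1 s, v_x ≈ 13.1 m/s, d ≈ 171 m), so the issue is presumably in the setup rather than the arithmetic.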