A projectile is shot from the edge of a cliff 125 m above ground level with an initial speed of 105 m/s at an angle of 37° above the horizontal. How long does it take the projectile to hit the ground, and how far from the base of the cliff does it land?

My main problem is that I don't know how to set up the motion equations using the 37° angle. Help! I've done the problem before and turned it in, and it was wrong.
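One way to see where the 37° enters: the angle only appears when you split the initial velocity into horizontal and vertical components; after that, each direction is handled separately (constant velocity horizontally, constant acceleration vertically). Here's a sketch of that setup in Python, assuming g = 9.8 m/s², no air resistance, and taking the ground as y = 0 with the launch point at y = 125 m:

```python
import math

g = 9.8            # m/s^2, magnitude of gravitational acceleration (assumed)
v0 = 105.0         # m/s, initial speed
theta = math.radians(37)
y0 = 125.0         # m, launch height above the ground

# The angle enters only through the initial-velocity components:
v0x = v0 * math.cos(theta)   # horizontal component, stays constant
v0y = v0 * math.sin(theta)   # vertical component, changed by gravity

# Vertical motion: y(t) = y0 + v0y*t - 0.5*g*t^2. The projectile lands
# when y = 0, so solve 0.5*g*t^2 - v0y*t - y0 = 0 (quadratic formula)
# and keep the positive root.
t = (v0y + math.sqrt(v0y**2 + 2 * g * y0)) / g

# Horizontal motion is uniform, so the landing distance is just v0x * t.
x = v0x * t

print(f"time of flight ≈ {t:.1f} s, distance from cliff base ≈ {x:.0f} m")
# → time of flight ≈ 14.6 s, distance from cliff base ≈ 1228 m
```

Note the sign convention: with up as positive, gravity contributes the −½gt² term, while v0y is positive because the launch is above the horizontal. The common mistake is plugging the full 105 m/s into the vertical equation instead of 105·sin 37°.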