Hi, I need someone to check if I did this right, since I got the questions off a PDF with no answers.

1. The problem statement, all variables and given/known data

A projectile is shot from the edge of a cliff 125 m above the ground. It has an initial velocity of 50.0 m/s at 37.0° above the horizontal at the launch point.
a) Determine the time taken to reach the ground.
b) How far is this point from the base of the cliff?
c) What are the magnitude and direction of the final velocity just before the point of impact?

It doesn't specify what value to use for gravity; I assumed g (or a) = -9.8 m/s^2. (Doesn't matter since there are no answers anyway.)

2. Relevant equations

v = u + at
s = ut (horizontal sense)

3. The attempt at a solution

I used this to find the time for a):
Vv = Uv + Av*t
0 = 50 sin37 - 9.8t
9.8t = 50 sin37
t = 3.1 s

b) Sh = Uh * t = 50 cos37 * 3.1
Sh = 123.79 m

c) Vertical component before impact:
Vv = Uv + Av*t = 60.47 m/s

I then used Pythagoras to find v = 72.47 m/s and θ = 56°33′ above the horizontal.

I feel that I went wrong somewhere; can someone check if these are right/close? Not sure if I copied everything down correctly either, my page is a bit of a mess.

edit - I just noticed I didn't once use the vertical distance, I think I'll have another look at it.

edit 2 -
Sv = Uv*t + (1/2)Av*t^2
125 = 50 sin37 * t - 4.9t^2
125 = 30.1t - 4.9t^2
-4.9t^2 + 30.1t - 125 = 0
4.9t^2 - 30.1t + 125 = 0

And... I'm getting a math error when using the quadratic formula. Not quite sure where to go from here... I realise now that the value for t that I had would only reach to the same height on the other side of the parabola... I'll have a look at this question again later.
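The "math error" in the quadratic comes from a sign: taking up as positive, the displacement to the ground is -125 m (not +125 m), which makes the discriminant positive. A minimal Python sketch to sanity-check the numbers under that convention, with g = 9.8 m/s^2 and variable names of my own choosing:

```python
import math

# Assumed sign convention: up is positive, so the landing point
# at the cliff base is at s_y = -125 m.
g = 9.8                      # m/s^2
v0, angle_deg, h = 50.0, 37.0, 125.0
uy = v0 * math.sin(math.radians(angle_deg))   # initial vertical speed
ux = v0 * math.cos(math.radians(angle_deg))   # horizontal speed (constant)

# Solve -h = uy*t - (g/2)*t^2, i.e. (g/2)*t^2 - uy*t - h = 0,
# and keep the positive root.
a, b, c = g / 2, -uy, -h
t = (-b + math.sqrt(b * b - 4 * a * c)) / (2 * a)

sx = ux * t                  # horizontal distance from the cliff base
vy = uy - g * t              # vertical velocity at impact (negative = downward)
speed = math.hypot(ux, vy)   # magnitude of final velocity
theta = math.degrees(math.atan2(-vy, ux))     # angle below the horizontal

print(f"t = {t:.2f} s, range = {sx:.1f} m, "
      f"speed = {speed:.1f} m/s at {theta:.1f} deg below horizontal")
```

With the corrected sign this gives a flight time of roughly 9 s, i.e. much longer than the 3.1 s time-to-apex, which matches the observation in the edit that 3.1 s only brings the projectile back to launch height on the far side of the parabola.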