1. The problem statement, all variables and given/known data

An object thrown up from a cliff at 10 m/s reaches a velocity of 20 m/s [down] as it lands. If the acceleration due to gravity is 9.8 m/s², what is the object's displacement? How long did it take the object to land from the time it was thrown?

2. Relevant equations

I'm not sure if all of the below are relevant, or whether some equations are missing.

Δx = v0t + ½at² (displacement = initial velocity × time + ½ × acceleration × time squared)
vf² = v0² + 2aΔx (final velocity squared = initial velocity squared + 2 × acceleration × displacement)

3. The attempt at a solution

t1 = v0/a = 10/9.8 = 1.02 s (time to reach the highest point)
t2 = vf/a = 20/9.8 = 2.04 s (time to fall from the highest point to the ground)
t1 + t2 = 3.1 s

Not sure how to find the displacement, but the answer key says it's −15 m. How do I find it?
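As a quick numeric check of the second relevant equation, vf² = v0² + 2aΔx, here is a short sketch. It takes up as positive (a sign convention I'm assuming, not stated in the post), so v0 = +10 m/s, vf = −20 m/s, and a = −9.8 m/s²:

```python
# Sign convention (my assumption): up is positive.
v0 = 10.0    # initial velocity, m/s (thrown upward)
vf = -20.0   # final velocity, m/s (moving downward at landing)
a = -9.8     # acceleration due to gravity, m/s^2

# Rearranging vf^2 = v0^2 + 2*a*dx for the displacement dx:
dx = (vf**2 - v0**2) / (2 * a)

# Rearranging vf = v0 + a*t for the total flight time t:
t = (vf - v0) / a

print(round(dx, 1))  # -15.3 (m), i.e. about -15 m, matching the answer key
print(round(t, 2))   # 3.06 (s), matching the 3.1 s found above
```

Note that squaring removes the sign of vf, so the equation gives the same Δx whichever direction the final velocity points; the sign of Δx comes out of the algebra because a is negative.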