1. The problem statement, all variables and given/known data

A car drives off a cliff that is 100 m high. It must land in the water, which starts 30 m from the base of the cliff, and the goal is to land 90 m into the water. How fast must the car be going to land at that point in the water? Air resistance is negligible.

v = 0, v0 = ?, a = -9.81 m/s², Δy = 100 m, Δx = 30 m + 90 m = 120 m

2. Relevant equations

My plan was to use kinematics: first find the time it takes the car to fall using Δy = vt - ½at², then put that time into Δx = ½(v + v0)t. Isn't it true that an object thrown horizontally and an object simply dropped take the same time to hit the ground? Why does that not apply here regarding the horizontal distance and velocity?

3. The attempt at a solution

My answer is wrong: I get 53 m/s, which seems like a lot anyway.
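The two-step plan above can be sketched numerically. A minimal check, assuming the car leaves the cliff horizontally (zero initial vertical velocity) and that the horizontal velocity stays constant in flight (there is no horizontal acceleration, so Δx = v0·t rather than an averaged-velocity formula):

```python
import math

g = 9.81      # m/s^2, magnitude of gravitational acceleration
dy = 100.0    # m, height of the cliff
dx = 120.0    # m, horizontal distance: 30 m to the water + 90 m into it

# Step 1: time to fall, from Δy = ½·g·t² (no initial vertical velocity).
# This fall time is the same whether the car is dropped or launched
# horizontally, which is the fact mentioned in the question.
t = math.sqrt(2 * dy / g)

# Step 2: horizontal speed needed; horizontal velocity is constant,
# so Δx = v0·t
v0 = dx / t

print(f"fall time t = {t:.2f} s")
print(f"required v0 = {v0:.1f} m/s")
```

With these numbers the sketch gives t ≈ 4.52 s and v0 ≈ 26.6 m/s, about half of the 53 m/s quoted in the attempt; the factor of two suggests the ½(v + v0)t step was evaluated with v = 0, even though the final horizontal velocity equals v0 when air resistance is neglected.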