You are traveling at a constant speed vM, and there is a car in front of you traveling with a speed vA. You notice that vM>vA, so you start slowing down with a constant acceleration a when the distance between you and the other car is x. What relationship between a and x determines whether or not you run into the car in front of you?
Relevant equations
x = v0*t - 1/2*a*t^2
The Attempt at a Solution
After a time t, the trailing car has closed the gap by (vM - vA)*t - 1/2*a*t^2. To avoid a crash, this closing distance must stay smaller than the initial gap x for every t while you are still catching up:
vM*t - 1/2*a*t^2 - vA*t < x
What should be the next step after this?
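One way to sanity-check any condition you derive is numerically: the closing distance (vM - vA)*t - 1/2*a*t^2 peaks when the relative velocity reaches zero, at t = (vM - vA)/a, so a crash occurs exactly when that peak reaches x. A minimal sketch, using hypothetical sample numbers (30 m/s behind a 20 m/s car, 50 m apart):

```python
def crashes(vM, vA, a, x):
    """Return True if the trailing car reaches the lead car.

    The closing distance (vM - vA)*t - a*t^2/2 peaks at
    t = (vM - vA)/a, where the relative velocity drops to zero;
    after that the gap stops shrinking. So a crash happens iff
    the peak closing distance reaches the initial gap x.
    """
    dv = vM - vA
    max_closing = dv * dv / (2.0 * a)  # peak of dv*t - a*t^2/2
    return max_closing >= x

# Hypothetical numbers: the threshold deceleration here is
# (30 - 20)^2 / (2 * 50) = 1 m/s^2.
print(crashes(30.0, 20.0, 0.9, 50.0))  # braking too gently -> True
print(crashes(30.0, 20.0, 1.1, 50.0))  # braking hard enough -> False
```

This suggests the relationship you are after compares a with (vM - vA)^2 / (2x), which you can confirm analytically by maximizing your inequality's left-hand side over t.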