A jet plane comes in for a landing with a speed of
100 m/s, and its acceleration can have a maximum magnitude
of 5.00 m/s^2 as it comes to rest. (a) From the
instant the plane touches the runway, what is the minimum
time interval needed before it can come to rest?
Kinematics I suppose, but I'm trying to solve it with just calculus.
The Attempt at a Solution
So my idea was: if the acceleration is constant at -5 m/s^2, then I should be able to integrate it to get the velocity function. What I end up with is
v(t) = -(5/2)t^2 + 100
Then I solve v(t) = 0 for t.
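To make the attempt concrete, here is the same calculation as a short Python sketch (the function `v` just encodes my assumed integration result, -(5/2)t^2 + 100, which is the step I'm unsure about):

```python
import math

# My assumed antiderivative of a(t) = -5 m/s^2, with v(0) = 100 m/s:
def v(t):
    return -2.5 * t**2 + 100.0

# Solve v(t) = 0 for t >= 0:  -(5/2) t^2 + 100 = 0  ->  t = sqrt(100 / 2.5)
t_stop = math.sqrt(100.0 / 2.5)
print(t_stop)  # about 6.32 s, which does not match the expected answer
```

Running this gives roughly 6.32 s, nowhere near 20 s, so something in the integration step must be off.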
But that answer is wrong; it's supposed to be 20 s.
Where is my logic flawed?