A person drives a car in a straight line for 20 seconds with an initial velocity of 20 m/s. For the entire journey they apply the brakes, causing the car to decelerate at 5 m/s². How far will the car be from the starting point after the given time?

I've attempted the following:

distance = 20 × 20 + 0.5 × (−5 m/s²) × 20²

However, since I get a negative value, I'm convinced that my working is incorrect. What am I misunderstanding about the deceleration?
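To make my attempt concrete, here is a quick sketch (just my own check of the arithmetic) that evaluates the constant-acceleration formula d = v₀t + ½at² with the numbers from the problem:

```python
# Checking my attempted formula: d = v0*t + 0.5*a*t^2
v0 = 20.0   # initial velocity, m/s
a = -5.0    # deceleration, m/s^2 (negative since it opposes motion)
t = 20.0    # total time, s

d = v0 * t + 0.5 * a * t**2
print(d)  # -600.0
```

So the formula gives −600 m, which is the negative result I'm puzzled by.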