Alright, so the problem I'm working on is basically this: an object with an initial velocity of 95 m/s slides horizontally, constantly losing velocity. After traveling 8 km it comes to a stop. How long did the object take to travel the 8 km?

Here's how I went at it:

δx = v0 + .5at^2
δx + v0 = .5(δv/t)t^2
2(δx + v0) = δv(t^2)/t
2(δx + v0)/δv = (t^2)/t
2(δx + v0)/δv = t

Then I plugged in:

δx = 8000 m
v0 = 95 m/s
δv = -95 m/s

which returned t = -170.421 s (which I thought was odd to begin with, as how can one have a negative value for time?). But I tried plugging that back into my initial equation to check that it all worked out, and instead of getting δx = 8000 m I got δx = 8190 m.

So obviously there is a flaw in my logic SOMEWHERE, I just don't know where. Help please!?
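In case it helps anyone spot the flaw, here's a quick numeric sketch that just reproduces my steps as I wrote them (it mirrors my algebra above, it's not a corrected solution):

```python
# Reproducing my own steps numerically -- these are the same (possibly
# flawed) formulas from my derivation above, not a known-good solution.
dx = 8000.0   # m, distance traveled
v0 = 95.0     # m/s, initial velocity
dv = -95.0    # m/s, change in velocity (final 0 minus initial 95)

t = 2 * (dx + v0) / dv            # my rearranged formula for t
a = dv / t                        # acceleration, assuming a = dv/t
dx_check = v0 + 0.5 * a * t**2    # plugging t back into my first equation

print(t)         # comes out negative, around -170.42 s
print(dx_check)  # comes out 8190 m instead of the 8000 m I started with
```

Running this gives me exactly the numbers I described: a negative time of about -170.421 s, and a plug-back distance of 8190 m rather than 8000 m.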