I have shown my work and gotten an answer, but it differs from the answer in the back of the book. Thanks for your help!!

Problem: A driver in a car traveling at a speed of 60 mi/h sees a deer 100 m away on the road. Calculate the minimum constant acceleration necessary for the car to stop without hitting the deer (assuming the deer does not move in the meantime).

My work: First I converted 60 mi/h into m/s:

(60 mi/h) x (1609 m/mi) x (1 h/3600 s) = 26.82 m/s

Then I divided 100 m by 26.82 m/s: 100/26.82 = 3.73, so I got -3.73 m/s^2 (negative because the car is slowing down??).

Did I do the problem wrong? My solution seems so short that I think I did. The actual answer is -3.6 m/s^2. What happened??
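For anyone checking the arithmetic, here is a short sanity-check script comparing the two numbers. Note this is my own check, not part of the original post: it uses the standard constant-acceleration relation v^2 = v0^2 + 2*a*d (with final v = 0), which is not the formula the post applied. It also shows that 100/26.82 carries units of seconds (a time), not m/s^2.

```python
# Stopping problem: initial speed v0 = 60 mi/h, stopping distance d = 100 m.
v0 = 60 * 1609 / 3600   # convert 60 mi/h to m/s; about 26.82 m/s
d = 100.0               # distance to the deer in meters

# The post's computation: d / v0.
# Units: m / (m/s) = s, so this is a time (~3.73 s), not an acceleration.
posted_value = d / v0

# Kinematics with constant acceleration: v^2 = v0^2 + 2*a*d, final v = 0,
# so a = -v0^2 / (2*d).
a = -v0**2 / (2 * d)

print(f"d / v0        = {posted_value:.2f} s")
print(f"a = -v0^2/2d  = {a:.2f} m/s^2")
```

Running this reproduces both numbers: 3.73 for the post's value and about -3.60 m/s^2 for the book's answer.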