1. The problem statement, all variables and given/known data

Attempting to stop on a slippery road, a car moving at 80 km/h skids across the road at a 30° angle to its initial motion, coming to a stop in 3.9 s. Determine the average acceleration in m/s², using a coordinate system with the x axis in the direction of the car's original motion and the y axis toward the side of the road to which the car skids.

3. The attempt at a solution

I did a whole chunk of working, but it isn't coming out to a sensible answer. Could someone give me a leg up?

0 m/s = 22.2 m/s + a cos30° (3.9 s)
a = -6.57 m/s²

vx = 22.2 m/s + (-6.57 m/s² cos30°)(3.9 s)
vy = (-6.57 m/s² sin30°)(3.9 s)

I take the square root of the sum of the squares of vx and vy to get the resultant, then divide |v| by t = 3.9 s, but it doesn't yield -5.7 m/s². What is wrong here?
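For what it's worth, here is a minimal numerical sketch of the definition of average acceleration, Δv/Δt. The assumption in this sketch (not stated in the post) is that the initial velocity is taken entirely along +x, i.e. only the initial and final velocities enter, not the path of the skid:

```python
# Hypothetical sanity check (an assumption, not the poster's method):
# by definition, average acceleration = Δv / Δt, so only the initial
# and final velocities matter, not the path taken during the skid.

v0 = 80 / 3.6   # 80 km/h converted to m/s (about 22.2 m/s)
t = 3.9         # stopping time in s

ax = (0 - v0) / t   # x-component of Δv/Δt; final velocity is zero
ay = (0 - 0) / t    # y-component of the assumed initial velocity is zero

print(round(ax, 2), round(ay, 2))  # → -5.7 0.0
```

Under that assumption the magnitude comes out to v0/t ≈ 5.7 m/s², which matches the target value in the post; whether the 30° skid direction should instead set the direction of Δv is the interpretive question here.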