An object has an initial velocity V = vi + vj, where v = 10 m/s. An acceleration A = ak, where a = 10 m/s², is applied for 10 seconds. Determine the final velocity of the object. How far did the object travel in 10 s? How far from the z axis is the object?

To start, I used the equation Vf = Vi + at:

Vf = (10i + 10j) m/s + (10 m/s² k)(10 s)
Vf = (10i + 10j + 100k) m/s

Then to determine the distance, I used

D² = (Δx)² + (Δy)² + (Δz)²

so D = √(10,200) m? I am not sure if I calculated this right.

For the last part of the question, I am not really sure how to proceed. I know the object moved 100 m in the x direction and 100 m in the y direction, and I think it moved 500 m in the z direction, since Vf = 100 m/s and Vi = 0 m/s in the z direction (average velocity 50 m/s over 10 s). So I am assuming we have to use the Pythagorean theorem again? I am not sure.

Please help, and comment on any of my previous answers.
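To sanity-check the components I'm claiming, here is a quick numerical sketch (my own, not from the problem) of the constant-acceleration kinematics, assuming Vf = Vi + at and d = Vi·t + ½at² hold componentwise:

```python
import math

# Given values from the problem statement
v = 10.0   # m/s, each in-plane component of the initial velocity
a = 10.0   # m/s^2, acceleration along k (the z direction)
t = 10.0   # s

# Final velocity, component by component: Vf = Vi + a*t
vf = (v, v, a * t)                   # (10, 10, 100) m/s

# Displacement under constant acceleration: d = Vi*t + (1/2)*a*t^2
d = (v * t, v * t, 0.5 * a * t**2)   # (100, 100, 500) m

# Straight-line distance traveled (the D^2 formula, fed with
# displacements Δx, Δy, Δz)
D = math.sqrt(d[0]**2 + d[1]**2 + d[2]**2)

# If the Pythagorean-theorem idea is right, the distance from the
# z axis should only involve the x and y displacements
r = math.sqrt(d[0]**2 + d[1]**2)

print(vf, d, D, r)
```

This at least confirms the 100 m, 100 m, and 500 m displacement components I reasoned out above.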