1. The problem statement, all variables and given/known data

An archer stationed at the edge of a cliff shoots an arrow horizontally at 90 m s^-1. The arrow falls and hits a target. Assuming the arrow is shot from a height of 20 m, calculate:

i. How long it takes the arrow to reach the target. (done)
ii. How far horizontally the target is from the base of the cliff. (done)
iii. The resultant velocity at impact and the angle this makes with the vertical. Mention one assumption made.

2. Relevant equations

SUVAT equations: v = u + at, s = ut + (1/2)at^2, v^2 = u^2 + 2as

3. The attempt at a solution

Parts i. and ii. were tackled by splitting the problem in two, resolving components vertically and horizontally. The time of flight came out to 2 s and the target 180 m from the base of the cliff.

What confuses me in part iii. is the assumption I have to make. I know that the resultant velocity is the vector sum of the horizontal and vertical components, and that the angle follows from resolving the vector diagram. Am I missing something obvious? Cheers.
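For what it's worth, here is a quick numerical sketch of the working so far plus part iii. It assumes g = 10 m s^-2, which is what reproduces the stated t = 2 s and 180 m figures (with g = 9.81 the numbers shift slightly):

```python
import math

g = 10.0   # m/s^2 (assumed value; reproduces t = 2 s from the thread)
h = 20.0   # m, height of the cliff
vx = 90.0  # m/s, horizontal launch speed (unchanged throughout the flight)

t = math.sqrt(2 * h / g)       # time to fall, from h = (1/2) g t^2
x = vx * t                     # horizontal distance to the target
vy = g * t                     # vertical speed at impact, from v = u + at with u = 0
v = math.hypot(vx, vy)         # magnitude of the resultant velocity
theta = math.degrees(math.atan2(vx, vy))  # angle measured from the vertical

print(f"t = {t:.1f} s, x = {x:.0f} m")
print(f"impact speed = {v:.1f} m/s at {theta:.1f} deg from the vertical")
```

Note that the angle with the vertical uses tan(theta) = vx / vy, so theta = atan(90 / 20) ≈ 77.5 degrees, and the impact speed is sqrt(90^2 + 20^2) ≈ 92.2 m/s.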