1. The problem statement, all variables and given/known data

A baseball is hit into the air at an initial speed of 36.6 m/s at an angle of 50° above the horizontal. At the same moment, the center fielder starts running away from the batter, and he catches the ball 0.914 m above the level at which it was hit. If the center fielder is initially 110 m from home plate, what is his average speed?

2. Relevant equations

final velocity = initial velocity + (acceleration × time)
displacement = 1/2 × (initial velocity + final velocity) × time

3. The attempt at a solution

It seems to me that some information is missing. The problem says the fielder catches the ball 0.914 m above the level at which it was hit, but it doesn't say how high the ball was when it was hit. It also says the fielder is initially 110 m from home plate, but not how far away he is when he makes the catch, or how far he runs back from his initial position.
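For what it's worth, here is a minimal numerical sketch (in Python) of how the two listed equations could be combined, under two assumptions that are my reading of the problem rather than anything stated outright: that the 0.914 m is the ball's vertical displacement measured from the launch level (so the absolute launch height drops out), and that the fielder's run plus his initial 110 m equals the ball's horizontal distance from home plate at the catch. If those readings are right, no information is actually missing.

```python
import math

# Given data, plus an assumed value of g
v0 = 36.6                    # initial speed, m/s
theta = math.radians(50.0)   # launch angle above the horizontal
dy = 0.914                   # vertical displacement at the catch, m (assumed relative to launch level)
x_fielder0 = 110.0           # fielder's initial distance from home plate, m
g = 9.8                      # gravitational acceleration, m/s^2

# Velocity components at launch
v0y = v0 * math.sin(theta)
v0x = v0 * math.cos(theta)

# Combining the two listed equations (v = v0y - g*t and d = (1/2)(v0y + v)*t)
# gives dy = v0y*t - (1/2)*g*t^2. Solving for t and taking the later root
# (the ball is on its way down when caught):
t = (v0y + math.sqrt(v0y**2 - 2 * g * dy)) / g

# Horizontal distance of the ball from home plate at the catch
x_ball = v0x * t

# Distance the fielder covers, and his average speed over the flight time
run = x_ball - x_fielder0
v_avg = run / t

print(f"flight time: {t:.2f} s")
print(f"catch point: {x_ball:.1f} m from home plate")
print(f"fielder runs {run:.1f} m, average speed {v_avg:.2f} m/s")
```

With these numbers the flight time comes out near 5.7 s and the average speed near 4.2 m/s, but that hinges entirely on the assumed reading of the geometry above.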