1. The problem statement, all variables and given/known data

One of the fastest recorded pitches in Major League Baseball, thrown by Billy Wagner in 2003, was clocked at 101.0 mi/hr. If a pitch were thrown horizontally at this velocity, how far would the ball fall vertically (in feet) by the time it reaches home plate, 60.5 feet away?

2. Relevant equations

t = Δx / v_x
Δy = v0 t + (1/2) a t^2

3. The attempt at a solution

Δy = (0)t + (1/2)(-21.9 mi/hr)(0.000099 hr)^2
Δy = -1.07 × 10^-7 mi, or -5.66 × 10^-4 ft

This answer doesn't seem correct at all. Any help?
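As a sanity check on the arithmetic, here is a minimal Python sketch of the same two-step calculation, keeping every quantity in feet and seconds so no mixed mi/hr units sneak in. It assumes g = 32.2 ft/s^2 and zero initial vertical velocity, as in the attempt above:

```python
# Projectile-drop sanity check in consistent units (ft, s),
# assuming g = 32.2 ft/s^2 and zero initial vertical velocity.
v0_mph = 101.0               # pitch speed, mi/hr (given)
d = 60.5                     # horizontal distance to home plate, ft (given)
g = 32.2                     # gravitational acceleration, ft/s^2 (assumed)

v0 = v0_mph * 5280 / 3600    # convert mi/hr -> ft/s
t = d / v0                   # flight time from t = dx / v_x
drop = 0.5 * g * t**2        # vertical fall from dy = (1/2) g t^2

print(f"v0 = {v0:.1f} ft/s, t = {t:.3f} s, drop = {drop:.2f} ft")
```

Doing the conversion to ft/s once, up front, avoids having to express g in mi/hr^2 at all.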