1. The problem statement, all variables and given/known data
Ball A is thrown upward and rises to a height H with an initial velocity of 20 m/s. Ball B is thrown downward from the height H with a velocity of 20 m/s, and a third ball C is simply dropped from the same height. Find which ball has the maximum velocity when it hits the ground.

2. Relevant equations
v = u + gt, s = ut + (1/2)gt²

3. The attempt at a solution
A - By symmetry, the velocity on hitting the ground must equal the projection velocity, so I think it's 20 m/s.
B - Since it starts at 20 m/s from the top and gravity speeds it up on the way down, it's reasonable to assume its final velocity will be greater than A's.
C - This is where I'm confused. Using the formula v = u + gt with u = 0 gives v = gt, so C's final velocity depends only on t. Taking the falling part of A's journey, u = 0 and v = 20 m/s (from the first assumption), so gt = 20. Since A's falling portion also starts from rest at the same height H, is the fall time for A and C the same, making the answer A = C < B? How do I proceed?
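The attempt above can be sanity-checked numerically. This is just a sketch: it uses the time-free kinematic relation v² = u² + 2gs (not in the relevant-equations list, but it follows from combining them), takes g = 9.8 m/s² as an assumed value, and fixes H from ball A's ascent, since A rises to H with a launch speed of 20 m/s.

```python
import math

g = 9.8   # m/s^2, assumed value for the acceleration due to gravity
u = 20.0  # m/s, given launch speed of balls A and B

# Ball A rises to H with launch speed u; at the top v = 0,
# so from v^2 = u^2 - 2gH we get H = u^2 / (2g).
H = u**2 / (2 * g)

# Final speeds on hitting the ground, via v^2 = u0^2 + 2*g*H
# for the downward trip from height H:
v_A = u                            # A returns to launch height at its launch speed
v_B = math.sqrt(u**2 + 2 * g * H)  # B begins the drop already moving at u
v_C = math.sqrt(2 * g * H)         # C begins the drop from rest

print(f"H   = {H:.2f} m")
print(f"v_A = {v_A:.2f} m/s, v_B = {v_B:.2f} m/s, v_C = {v_C:.2f} m/s")
```

Running this gives v_A = v_C = 20 m/s and v_B = 20√2 ≈ 28.3 m/s, which agrees with the proposed answer A = C < B.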