1. The problem statement, all variables and given/known data
A ball is thrown vertically upwards with a speed of 10 m/s from a point 2 m above the horizontal ground.
a) Calculate the length of time for which the ball is 3 m or more above the ground.

2. Relevant equations
v^2 = u^2 + 2as
v = u + at

3. The attempt at a solution
There are two ways of doing this: the way my teacher taught us, which I've forgotten, and the logical "sledgehammer" way, which is:

s = 1, u = 10, a = -9.8, v = ?

Work out the velocity 1 m above the launch point (which is 3 m above the ground), then find the time taken, using that velocity, for the ball to reach its maximum height (v = 0), and multiply it by two to include the time spent falling back down.

v^2 = u^2 + 2as
v = SQRT(u^2 + 2as)
v = SQRT(10^2 + 2 x -9.8 x 1) <- Leave in this form to maintain accuracy.

Then, using t = (v - u)/a (derived from v = u + at), where u = SQRT(10^2 + 2 x -9.8 x 1) and v = 0:

[-SQRT(10^2 + 2 x -9.8 x 1)] / -9.8 = time to reach the top.

Multiply this by two to get the total time, which is 1.83 s.

There's nothing wrong with this method and it would attain full marks in an exam, but I'd like to learn an easier way of doing it (the way the examiners want us to). Any help would be appreciated! Thanks in advance.
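As a quick sanity check on the arithmetic in the attempt above, here is a short Python sketch (my own, not part of the original working) that follows the same steps: find the speed at the 3 m mark with v^2 = u^2 + 2as, take v/g as the time from there to the top, and double it. It assumes g = 9.8 m/s^2, as used above.

```python
import math

# Given values from the problem: thrown up at 10 m/s from 2 m above the ground
u = 10.0   # initial speed, m/s
g = 9.8    # magnitude of gravitational acceleration, m/s^2
s = 1.0    # displacement from launch point up to the 3 m mark (3 m - 2 m)

# Speed as the ball passes the 3 m mark: v^2 = u^2 + 2as with a = -g
v = math.sqrt(u**2 - 2 * g * s)

# Time from the 3 m mark to the top is v/g; double it for the trip back down
t_total = 2 * v / g
print(round(t_total, 2))  # -> 1.83
```

This reproduces the 1.83 s found by hand, so the sledgehammer method checks out numerically.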