NoahCygnus

## Homework Statement

"A ball is thrown vertically upward with a speed '**v**' from a height '**h**' metre above the ground. The time taken for the ball to hit the ground is."

## Homework Equations

##s = ut + \frac{1}{2}at^2##

## The Attempt at a Solution

[Attempt images (hand-drawn diagram with reference point o, plus written working) — the original attachment links have expired]

So we throw the ball vertically upward with a velocity 'v' from a reference point 'o', as I have labelled in the diagram. The ball returns to the reference point after an interval ##t_1##, and the displacement over this interval is 0. I used the equation ##s = ut + \frac{1}{2}at^2## and solved the resulting quadratic for ##t_1##.
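For reference, taking up as positive (so ##u = v##, ##a = -g##, ##s = 0## — the usual sign convention, which I am assuming here), that quadratic gives:

##0 = v t_1 - \frac{1}{2} g t_1^2 \quad\Rightarrow\quad t_1 = \frac{2v}{g}##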

Now the ball is back at o, but this time it has velocity -v and is falling. I assumed it takes the ball a time ##t_2## to fall the distance h from o to the ground. Again using the equation ##s = ut + \frac{1}{2}at^2## and solving the quadratic for ##t_2##, I calculated ##t_2##.
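Taking downward as positive for this second stage (so ##u = v##, ##a = g##, ##s = h## — again a sign-convention assumption), the quadratic and its positive root work out to:

##h = v t_2 + \frac{1}{2} g t_2^2 \quad\Rightarrow\quad t_2 = \frac{v}{g}\left[\sqrt{1+\frac{2gh}{v^2}}-1\right]##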

Then I added ##t_1## and ##t_2## to get the total time for the ball to reach the ground, but my answer doesn't match the answer given in the book, which is ##t = \frac{v}{g}\left[1+\sqrt{1+\frac{2gh}{v^2}}\right]##. I wonder what I got wrong. Thank you for your help.
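As a numerical sanity check, here is a minimal Python sketch comparing the two-stage sum ##t_1 + t_2## against the book's formula. The values of v, h, g are arbitrary placeholders I picked, not from the problem:

```python
import math

# Arbitrary sample values (any positive v, h, g would do)
v, h, g = 5.0, 10.0, 9.8

# Stage 1: up and back to the launch point o (s = 0, u = v, a = -g)
t1 = 2 * v / g

# Stage 2: from o down to the ground, taking downward as positive:
# h = v*t2 + (1/2)*g*t2^2  ->  keep the positive root of the quadratic
t2 = (-v + math.sqrt(v**2 + 2 * g * h)) / g

# Book's answer: t = (v/g) * [1 + sqrt(1 + 2*g*h/v^2)]
t_book = (v / g) * (1 + math.sqrt(1 + 2 * g * h / v**2))

print(abs((t1 + t2) - t_book) < 1e-12)  # → True
```

If the two-stage sum agrees with the book's formula numerically, the discrepancy is likely an algebra or sign slip in one of the quadratics rather than in the method itself.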
