collectedsoul
A person standing at the top of a hemispherical rock of radius $R$ kicks a ball (initially at rest on the top of the rock) to give it horizontal velocity $v_i$. (a) What must be its minimum initial speed if the ball is never to hit the rock after it is kicked?
I tried solving this but am getting the wrong answer. Please tell me what I'm doing wrong. I first took the vertical distance the ball travels and used $R = 0 \cdot t + \frac{1}{2}gt^2$, which gives a travel time of $t = \sqrt{2R/g}$. Then I used this travel time in the equation of motion for the horizontal direction: $R < v_i t + \frac{1}{2}at^2$. Since $a = 0$, substituting the time from above gives $v_i > \sqrt{Rg/2}$.

But the solution says the answer is $\sqrt{Rg}$. What am I doing wrong?
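To compare the two answers, here is a minimal numerical sketch (Python with NumPy; I'm assuming $R = 1$, $g = 9.8$, and the helper name clears_rock is just mine). Putting the origin at the rock's center, the ball starts at $(0, R)$ and follows $x(t) = v_i t$, $y(t) = R - \frac{1}{2}gt^2$, and I check whether $x^2 + y^2$ ever drops below $R^2$ during the flight, rather than only checking the landing point:

```python
import numpy as np

# Origin at the center of the hemispherical rock: the ball starts at
# (0, R) and follows x(t) = v*t, y(t) = R - g*t^2/2. It avoids the rock
# only if x^2 + y^2 >= R^2 for the WHOLE flight, not just at landing.

def clears_rock(v, R=1.0, g=9.8, steps=10_000):
    """True if the trajectory never enters the hemisphere (my own helper)."""
    t_land = np.sqrt(2 * R / g)          # time to fall the height R
    t = np.linspace(0.0, t_land, steps)
    x = v * t
    y = R - 0.5 * g * t**2
    return bool(np.all(x**2 + y**2 >= R**2 - 1e-9))  # small tolerance

R, g = 1.0, 9.8
print(clears_rock(np.sqrt(R * g / 2)))   # False: path dips inside the rock
print(clears_rock(np.sqrt(R * g)))       # True: path touches only at t = 0
```

Running this, the $\sqrt{Rg/2}$ trajectory does land exactly at the rim ($x = R$ when $y = 0$), matching my calculation, but it dips inside the hemisphere on the way down, while the $\sqrt{Rg}$ trajectory stays outside for the entire flight.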