1. The problem statement, all variables and given/known data

A person standing at the top of a hemispherical rock of radius R kicks a ball (initially at rest on top of the rock) to give it a horizontal velocity Vi. What must its minimum initial speed be if the ball is never to hit the rock after it is kicked? With this initial speed, how far does it go from the base of the rock when it hits the ground?

3. The attempt at a solution

The problem I'm having with this is that no numerical values are given: they don't tell you R or the time of flight, only that the initial velocity is horizontal. (The problem is 2D planar, by the way.)

I realized that the magnitude of the ball's position vector (measured from the center of the hemisphere) must always be at least the radius R at every x position; otherwise the ball hits the circular rock. I'm probably just short of a logical comprehension of the question. Does anyone know how I might solve this? It seems like the problem isn't really about plugging values into formulas but more about the logic.

I visualized it like this: the ball's initial position is (0, R), where R is the radius and the origin is the center of the hemispherical rock. The final position of the ball would then be (D + R, 0), with D being the distance from the base of the rock to where it lands. The only other piece of the puzzle is that the ball accelerates downward at 9.8 m/s^2 because of gravity, obviously.

Any input or help is GREATLY appreciated.
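In case it helps show where I'm stuck, here is a quick numerical sanity check I sketched of that "stay outside the circle" condition (this is my own check, not part of the assigned problem, and v_guess below is just a trial value, not a claimed answer): with x(t) = v*t and y(t) = R - g*t^2/2, the ball clears the rock only if x^2 + y^2 >= R^2 for every t until it reaches the ground.

```python
import math

def clears_rock(v, R=1.0, g=9.8, n=10_000):
    """Check numerically whether a ball kicked horizontally at speed v
    from the top of a hemisphere of radius R ever dips inside the rock.
    Trajectory (origin at the hemisphere's center):
        x(t) = v * t
        y(t) = R - 0.5 * g * t**2
    """
    t_land = math.sqrt(2 * R / g)  # time when y(t) reaches 0
    for i in range(1, n + 1):
        t = t_land * i / n
        x = v * t
        y = R - 0.5 * g * t * t
        # Inside the circle (with a small tolerance) means it hit the rock.
        if x * x + y * y < R * R - 1e-12:
            return False
    return True

R, g = 1.0, 9.8
v_guess = math.sqrt(g * R)                  # a trial speed to test
print(clears_rock(v_guess, R, g))           # True: this speed clears
print(clears_rock(0.9 * v_guess, R, g))     # False: slightly slower hits the rock
```

Playing with v_guess this way at least confirms my intuition that there is a sharp cutoff speed below which the ball grazes into the rock near the top, though I still need the algebra to find it exactly.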