I need a bit of help with a simple question. It's not difficult at all, but I just can't see what I did wrong; I must be overlooking something crucial.

A cannon is placed 40 m from the base of a 37.5 m high cliff. The cannon fires a ball at an angle of 45 degrees to the horizontal, and the gravitational acceleration is 10 m/s^2. The question is: with what minimum initial speed must the ball be fired so that it just hits the top of the cliff?

My attempt: I split the velocity into two components, vx and vy.

x = v0x * t (there is no net force in the x-direction, so vx stays equal to v0x)
vy = v0y - g*t (the only force acting on the ball is gravity)

Since the ball is at its highest point, its velocity in the y-direction must be 0. In that case v0y = g*t, so t = v0y/g = v0y/10. I can substitute this time into the equation for the y-position of the ball and find what the initial speed must be to reach that height:

sy = v0y*t - (1/2)*g*t^2 = v0y*t - 5*t^2

With t = v0y/10 and sy = 37.5:

37.5 = v0y^2/10 - 5*v0y^2/100 = v0y^2/20, so v0y = sqrt(750) ≈ 27.39 m/s

So v0 = 27.39/sin(45°) ≈ 38.73 m/s.

However, my textbook gives v0 = 80 m/s as the answer. What am I doing wrong?
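To rule out an arithmetic slip, here is a quick numerical check of exactly the steps above (g, the 37.5 m height, and the 45° angle are the values given in the problem; the reasoning that the apex height equals v0y^2/(2g) is my own assumption):

```python
import math

g = 10.0                 # m/s^2, given in the problem
h = 37.5                 # m, cliff height
angle = math.radians(45) # launch angle

# At the apex vy = 0, so t = v0y/g, and the height reached is
# sy = v0y*t - (g/2)*t^2 = v0y^2/(2g).
# Setting v0y^2/(2g) = h gives v0y = sqrt(2*g*h).
v0y = math.sqrt(2 * g * h)   # sqrt(750) ≈ 27.39 m/s
v0 = v0y / math.sin(angle)   # ≈ 38.73 m/s

print(round(v0y, 2), round(v0, 2))  # → 27.39 38.73
```

So the arithmetic reproduces my 38.73 m/s, which suggests the mistake is in the setup rather than the calculation.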