1. The problem statement, all variables and given/known data

A child running along level ground at the top of a 30-ft-high vertical cliff at a speed of 15 ft/s throws a rock over the cliff into the sea below. Suppose the child's arm is 3 ft above the ground and her arm speed is 25 ft/s. If the rock is released 10 ft from the edge of the cliff at an angle of 30 degrees, how long does it take for the rock to hit the water? How far from the base of the cliff does it hit?

2. Relevant equations

V = initial velocity
a = alpha = angle in radians
t = time
s = initial height
g = gravity = 32 ft/s^2

x(t) = (V*cos(a))t
y(t) = -(1/2)g(t^2) + (V*sin(a))t + s

3. The attempt at a solution

I have identified these variables:
V = 25 ft/s
s = 30 + 3 = 33 ft (cliff height plus arm height above the ground)
a = pi/6

I have solved one of the questions, the time of flight, by setting y(t) = 0:

0 = -16t^2 + (25/2)t + 33
t = 1.8789 s

Now, using t for the range, I get:

x = (25*cos(pi/6))(1.8789) ≈ 40.680244

To get the distance from the base of the cliff, I subtract the 10 ft between the release point and the edge from the range:

distance to base of cliff = 40.680244 - 10

But this is incorrect. Can anyone please help? I have been at this for hours and it seems so basic.
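To rule out arithmetic mistakes, here is a quick Python sketch that reproduces the numbers in my attempt (assuming, as above, V = 25 ft/s, a = pi/6, s = 33 ft, g = 32 ft/s^2):

```python
import math

g = 32.0         # ft/s^2
V = 25.0         # throw speed, ft/s
a = math.pi / 6  # 30 degrees
s = 33.0         # release height above the water: 30 ft cliff + 3 ft arm

# y(t) = -(1/2)g t^2 + V sin(a) t + s = 0  ->  quadratic A t^2 + B t + C = 0
A = -0.5 * g
B = V * math.sin(a)
C = s
t = (-B - math.sqrt(B * B - 4 * A * C)) / (2 * A)  # positive root

x = V * math.cos(a) * t  # horizontal distance traveled from the release point

print(t)       # flight time ~ 1.8789 s
print(x)       # range ~ 40.6802 ft
print(x - 10)  # my attempted distance from the base of the cliff
```

It matches my hand calculation, so the mistake must be in the setup rather than the algebra.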