A rock is thrown horizontally from the top of a tower at point A, and hits the ground 3.5 s later at point B. The line from A to B makes an angle of 50 degrees with the horizontal.
Compute the magnitude of the initial velocity u of the rock.
The Attempt at a Solution
I started by looking at the motion in the x direction, where the horizontal velocity is the unknown u (it stays constant, since there is no horizontal acceleration).
Then I looked at the motion in the y direction, where gravity acts.
So I used the formula for the vertical drop:
s = 1/2 * g * t^2
s = 1/2 * 9.81 * (3.5)^2 ≈ 60.1 m
Since the line AB makes 50 degrees with the horizontal, tan(50°) = s/x, so solving for x I get x = 60.1/tan(50°) ≈ 50.4 m.
Then u = x/t = 50.4/3.5 ≈ 14.4 m/s.
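As a sanity check, the steps above can be run numerically (a quick sketch, assuming g = 9.81 m/s² and the given values t = 3.5 s and 50°):

```python
import math

g = 9.81        # gravitational acceleration, m/s^2 (assumed value)
t = 3.5         # time of flight, s
angle = 50.0    # angle of line AB below the horizontal, degrees

# Vertical drop during the flight: s = 1/2 * g * t^2
s = 0.5 * g * t**2

# Line AB makes 50 degrees with the horizontal: tan(angle) = s / x
x = s / math.tan(math.radians(angle))

# Horizontal velocity is constant, so u = x / t
u = x / t

print(f"drop s = {s:.1f} m, range x = {x:.1f} m, u = {u:.1f} m/s")
```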
But I'm not sure if I have done it correctly.
Thank you for your help.