1. The problem statement, all variables and given/known data

A 500 kg object is launched from the top of a 12 m building with an initial speed of 5.2 m/s at an angle of 38° above the horizontal. How far from the building does the object land?

2. Relevant equations

y = v_i t + (1/2)at^2
quadratic formula
d = vt

3. The attempt at a solution

First I separate the initial velocity into its x and y components using cos and sin. Next I substitute the y-component of the velocity into y = v_i t + (1/2)at^2. I rearrange this into a quadratic in t and solve it, keeping the positive root, which gives t = 1.92 s. Then I multiply this time by the x-component of the velocity (the initial speed times the cosine of the launch angle). I end up with the object landing 7.9 m from the building. Can anyone check whether this is right? Also, the mass was negligible, right?
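The arithmetic in the attempt above can be checked with a short script (a sketch of my working, assuming g = 9.81 m/s² downward and taking up as positive, so the building top is y = 0 and the ground is y = -12 m):

```python
import math

# Given values from the problem statement
v0 = 5.2                      # initial speed, m/s
theta = math.radians(38.0)    # launch angle above horizontal
h = 12.0                      # building height, m
g = 9.81                      # gravitational acceleration, m/s^2

# Resolve the initial velocity into components
vx = v0 * math.cos(theta)
vy = v0 * math.sin(theta)

# Vertical motion: -h = vy*t - (1/2)*g*t^2
# Rearranged to standard quadratic form: (g/2)*t^2 - vy*t - h = 0
a, b, c = 0.5 * g, -vy, -h
t = (-b + math.sqrt(b * b - 4 * a * c)) / (2 * a)  # positive root only

# Horizontal distance: d = vx * t
d = vx * t

print(f"time of flight: {t:.2f} s")   # about 1.92 s
print(f"landing distance: {d:.1f} m") # about 7.9 m
```

Running this reproduces both the 1.92 s flight time and the 7.9 m landing distance, and the mass never enters the calculation.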