A projectile is fired over level ground with an initial velocity that has a vertical component of 20 m/s and a horizontal component of 30 m/s. Using g = 10 m/s², the distance from launching to landing points is:
The Attempt at a Solution
I used the first equation to solve for the time from start to finish: 0 = (20 m/s)t + 0.5(−10 m/s²)t², which gave me t = 4 seconds from launch to landing.
Next I used the same equation, only now with the x component: x = (30 m/s)(4 s), so the displacement was 120 meters.
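As a quick sanity check, here's a small sketch that reruns the same two steps numerically (variable names are just my own labels for the given quantities):

```python
# Values given in the problem statement
vy = 20.0  # vertical launch speed, m/s
vx = 30.0  # horizontal launch speed, m/s
g = 10.0   # gravitational acceleration, m/s^2

# Vertical motion: 0 = vy*t - 0.5*g*t^2  =>  t = 2*vy/g (nonzero root)
t_flight = 2 * vy / g

# Horizontal motion at constant speed over the flight time
range_x = vx * t_flight

print(t_flight)  # 4.0 (seconds)
print(range_x)   # 120.0 (meters)
```

Both numbers match the hand calculation above.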
Any errors you see?