r_swayze
If the vertical component of the velocity is 20 m/s, the horizontal component is 30 m/s, and g = 10 m/s^2, how far from its launch point does the projectile land?
The answer I got was 120 m — is this correct?
Here's my method:
use the Pythagorean theorem to find the initial speed v0 from the vertical and horizontal components
then find the launch angle theta from vx = v0 cos(theta)
then use R = (v0^2 / g) sin(2*theta) to find the range
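A quick way to check the arithmetic is to compare your angle-based method against the more direct time-of-flight approach (t = 2*vy/g, then R = vx*t). This is a minimal Python sketch of that comparison, not part of the original post; both routes should agree:

```python
import math

vy, vx, g = 20.0, 30.0, 10.0  # vertical speed, horizontal speed, gravity

# Method 1: reconstruct v0 and the launch angle, then use the range formula
v0 = math.hypot(vx, vy)          # Pythagorean theorem: sqrt(vx^2 + vy^2)
theta = math.atan2(vy, vx)       # launch angle, since vx = v0*cos(theta)
R_formula = (v0**2 / g) * math.sin(2 * theta)

# Method 2: time of flight directly from the vertical component
t = 2 * vy / g                   # up-and-down time on level ground
R_direct = vx * t

print(R_formula, R_direct)       # both come out to 120 m
```

Note that sin(2*theta) = 2*sin(theta)*cos(theta) = 2*(vy/v0)*(vx/v0), so the range formula reduces algebraically to R = 2*vx*vy/g — the same expression as the time-of-flight route, which is why the two methods must agree.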