Joza
If an arrow is shot at an angle of 45° to the horizontal and lands on the ground 100 m away, at the same height it was shot from, is it possible to work out the time taken? I can't figure it out. I have been messing around with the kinematic equations, but because we don't have an initial velocity I can't get anywhere.
Any ideas?
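One sketch of why it is possible, assuming the standard constant-gravity projectile equations and g ≈ 9.8 m/s²: the equal-height range formula fixes the launch speed, and that speed then fixes the flight time, so no initial velocity needs to be given.

$$R = \frac{v_0^2 \sin 2\theta}{g} \quad\Rightarrow\quad v_0 = \sqrt{gR} \approx \sqrt{9.8 \times 100} \approx 31.3\ \text{m/s} \qquad (\theta = 45^\circ)$$

$$t = \frac{2 v_0 \sin\theta}{g} = \sqrt{\frac{2R}{g}} \approx \sqrt{\frac{200}{9.8}} \approx 4.5\ \text{s}$$

In other words, the two conditions given (range 100 m, landing at launch height) are enough to pin down the two unknowns (initial speed and time of flight).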