- #1
adam.kumayl
In the monkey-dart experiment (where a monkey lets go of a vine at exactly the same moment a dart is shot at him), the dart is known to hit the monkey because the distance the monkey falls from the vine equals the distance the dart falls below the point it was aimed at: both are (1/2)gt^2.
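Here's a quick sketch of the setup I mean, with made-up numbers for v, alpha, and the monkey's position (just for illustration):

```python
import math

g = 9.8                     # m/s^2
v = 30.0                    # launch speed (arbitrary)
alpha = math.radians(40)    # launch angle (arbitrary)

# The dart is aimed straight at the monkey, so the monkey sits on the aim line.
x_monkey = 20.0
y_monkey = x_monkey * math.tan(alpha)

# Time for the dart to cover the horizontal distance to the monkey's x position.
t = x_monkey / (v * math.cos(alpha))

dart_y   = v * math.sin(alpha) * t - 0.5 * g * t**2   # dart height at that time
monkey_y = y_monkey - 0.5 * g * t**2                  # monkey height after free fall

# Both have dropped (1/2)*g*t^2 below where they started/were aimed, so they meet.
print(dart_y, monkey_y)
```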
My question: the horizontal velocity of the dart obviously increases as alpha (the angle above the horizontal at which the dart is shot) decreases. If the dart is shot straight up (90 degrees), it has no horizontal velocity at all. So if the lower trajectories have the higher horizontal velocity, v*cos(alpha), as we know they do, how can it take them longer to reach the SAME x position than it takes one of the trajectories with a larger alpha?
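To make the comparison I'm asking about concrete (same made-up launch speed, a common x position, numbers are just placeholders):

```python
import math

g = 9.8
v = 30.0     # same launch speed for every angle
x = 20.0     # common horizontal position to reach

for alpha_deg in (30, 45, 60, 75):
    alpha = math.radians(alpha_deg)
    # Horizontal motion is uniform, so x = v*cos(alpha)*t.
    t = x / (v * math.cos(alpha))
    print(alpha_deg, round(t, 3))
```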
Thanks!