In the monkey-dart experiment (where a monkey lets go of a vine at exactly the same time a dart is fired at him), the dart is known to hit the monkey because the distance the monkey falls from the vine equals the difference between where the dart was aimed and where the dart ends up: both are (1/2)gt^2. My question: the horizontal velocity of the dart obviously increases as alpha (the angle above the horizontal at which the dart is fired) decreases. If the dart is fired straight up (90 degrees), it has no horizontal velocity at all. So if the lower curves have a higher horizontal velocity (v*cos(alpha)), as we know they do, how is it that they take longer to reach the SAME x position than one of the trajectories with a higher angle alpha? Thanks!
You are not interpreting the problem correctly. The angle alpha is not varied: it is fixed by the monkey's height and horizontal distance from the dart's launch point (tan(alpha) = height/distance). What varies is the initial speed of the dart, from an arbitrarily high value down to the lowest value that still lets the dart reach the monkey before it hits the ground. Anywhere in that range of speeds, with the dart fired at the fixed angle alpha, the dart will always hit the monkey (ignoring air resistance, of course). And the lower curves correspond to a lower initial speed, so they have a lower horizontal velocity, not a higher one; a slower dart simply intercepts the monkey later and lower.
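To see this numerically, here is a small sketch (the distance d = 50 m and height h = 30 m are just example values I picked): the aim angle is fixed by the geometry, and for any initial speed the dart's height when it reaches x = d equals the falling monkey's height at that same instant.

```python
import math

def dart_hits_monkey(v, d=50.0, h=30.0, g=9.8):
    """Fire a dart at speed v, aimed straight at the monkey's starting
    point (d, h). Both the dart and the monkey start moving at t = 0.
    Returns (hit, t) where t is when the dart reaches x = d."""
    alpha = math.atan2(h, d)           # aim angle fixed by the geometry, not by v
    t = d / (v * math.cos(alpha))      # time for the dart to cover horizontal distance d
    dart_y = v * math.sin(alpha) * t - 0.5 * g * t**2
    monkey_y = h - 0.5 * g * t**2      # monkey in free fall from height h
    return math.isclose(dart_y, monkey_y, abs_tol=1e-9), t

# The angle never changes; only the speed does. A faster dart reaches
# x = d sooner (hitting the monkey higher up); a slower one takes
# longer (hitting the monkey lower down).
for v in (40.0, 60.0, 100.0):
    hit, t = dart_hits_monkey(v)
    print(f"v = {v:5.1f} m/s  hit = {hit}  intercept at t = {t:.2f} s")
```

Algebraically, the dart's drop term and the monkey's drop term are the same (1/2)gt^2, and the dart's "no-gravity" path passes through the monkey's start point by construction, which is why the speed drops out of the hit condition entirely.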