adam.kumayl  |  #1  |  Feb 3, 2013, 06:37 PM  |  Posts: 6
In the monkey-dart experiment (where a monkey lets go of a vine at exactly the same moment a dart is fired at him), the dart is known to reach the monkey because the distance the monkey falls from the vine equals the distance the dart drops below the point it was aimed at: both are (1/2)gt^2.
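For reference, here is a minimal numerical sketch of that argument (the launch speed, monkey position, and g = 9.8 m/s^2 are arbitrary choices, not values from the attached figure):

import math

g = 9.8                  # m/s^2
v = 20.0                 # launch speed (arbitrary assumption)
x_m, y_m = 30.0, 15.0    # monkey's starting position (arbitrary assumption)

# Aim the dart straight at the monkey's starting point.
alpha = math.atan2(y_m, x_m)

# Time for the dart to cover the horizontal distance x_m.
t = x_m / (v * math.cos(alpha))

# Heights of dart and monkey at that instant: both have dropped (1/2)*g*t^2
# below where they would be without gravity, so the two heights agree.
y_dart = v * math.sin(alpha) * t - 0.5 * g * t**2
y_monkey = y_m - 0.5 * g * t**2
print(y_dart, y_monkey)  # prints the same height twice -> the dart hits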

My question: the horizontal velocity of the dart obviously increases as alpha (the angle above the horizontal at which the dart is fired) decreases. If the dart is fired straight up (90 degrees), it has no horizontal velocity at all. So if the lower curves have the higher horizontal velocity, v*cos(alpha), as we know they do, how can it take them longer to reach the SAME x position than it takes one of the trajectories with a larger alpha?
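To put numbers on the timing, here is a quick sketch that just tabulates t = x / (v*cos(alpha)) for a few angles, assuming every trajectory is launched at the same speed v (which may or may not be what the curves in the attached figure share):

import math

v = 20.0    # same launch speed for every angle (assumption)
x = 15.0    # horizontal distance to compare at (arbitrary)

for alpha_deg in (30, 45, 60, 75):
    alpha = math.radians(alpha_deg)
    t = x / (v * math.cos(alpha))  # time to reach x, from x = v*cos(alpha)*t
    print(alpha_deg, round(t, 2))  # larger alpha -> smaller v*cos(alpha) -> more time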

Thanks!
[Attached thumbnail: Monkey Dart.png]