adam.kumayl | #1 | Feb 3, 2013, 06:37 PM | Posts: 6
In the monkey-dart experiment (where a monkey lets go of a vine at exactly the same time a dart is shot at him), the dart is known to hit the monkey because the distance the monkey falls below the vine equals the distance the dart drops below the point it was aimed at: both are (1/2)gt^2.
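For concreteness, here is the algebra I have in mind, writing h for the monkey's starting height above the launch point, d for the horizontal distance to the vine, v for the launch speed, and alpha for the launch angle (these symbols are my own labels, not from the attached figure):

```latex
% Vertical positions measured from the launch point, with the dart aimed straight at the monkey:
y_{\text{monkey}}(t) = h - \tfrac{1}{2} g t^{2},
\qquad
y_{\text{dart}}(t) = (v \sin\alpha)\, t - \tfrac{1}{2} g t^{2}
% At t^* = d / (v \cos\alpha) the dart has covered the horizontal distance d, and since
% \tan\alpha = h/d (it was aimed at the monkey), (v \sin\alpha) t^* = d \tan\alpha = h.
% Hence y_dart(t^*) = h - (1/2) g (t^*)^2 = y_monkey(t^*): the dart and the monkey meet.
```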

My question: the horizontal velocity of the dart, v*cos(alpha), obviously increases as alpha (the angle above the horizontal at which the dart is shot) decreases. In the limit where the dart is shot straight up (alpha = 90 degrees), it has no horizontal velocity at all. So if the lower trajectories have the higher horizontal velocity, as we know they do, how can it take them longer to reach the SAME x position than it takes one of the trajectories with a larger alpha?
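To show what I would expect numerically, here is a quick sanity-check script (the speed and distance are made-up numbers, not taken from the figure). The time to reach a fixed horizontal position d is t = d / (v cos(alpha)), so a lower alpha should reach that position sooner, not later:

```python
import math

v = 20.0   # launch speed in m/s (made-up number)
d = 10.0   # horizontal distance from the launcher to the vine, in m (made-up number)

for alpha_deg in (60, 45, 30):            # higher to lower launch angles
    alpha = math.radians(alpha_deg)
    vx = v * math.cos(alpha)              # horizontal velocity component
    t = d / vx                            # time to cover the horizontal distance d
    print(f"alpha = {alpha_deg:2d} deg:  vx = {vx:5.2f} m/s,  t = {t:.3f} s")
```

With these numbers it prints roughly t = 1.00 s at 60 degrees, 0.71 s at 45 degrees, and 0.58 s at 30 degrees, which is the ordering I would expect.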

Thanks!
Attachment: Monkey Dart.png