Two archers shoot arrows in the same direction from the same point with the same initial speed, but at different angles: one shoots at 45.0 degrees above the horizontal, while the other shoots at 60.0 degrees. If the arrow launched at 45.0 degrees lands 225 m from the archer, how far apart are the two arrows when they land? Assume that the arrows start at essentially ground level.
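
One way to set this up (a sketch assuming level ground, no air resistance, and the standard flat-ground range formula, which matches the "essentially ground level" condition in the problem):

$$R(\theta) = \frac{v_0^2 \sin 2\theta}{g}$$

At 45.0 degrees, $\sin 90^\circ = 1$, so $v_0^2/g = R_{45} = 225\ \text{m}$. The 60.0-degree arrow then lands at

$$R_{60} = \frac{v_0^2}{g}\,\sin 120^\circ = 225\ \text{m} \times \frac{\sqrt{3}}{2} \approx 194.9\ \text{m},$$

so the two arrows land about $225 - 194.9 \approx 30.1\ \text{m}$ apart. Note that both values drop out of the 45-degree range alone; the initial speed never needs to be computed explicitly.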