
## Homework Statement

According to the Guinness Book of World Records, the longest home run ever measured was hit by Roy “Dizzy” Carlyle in a minor league game. The ball traveled 188 m (618 ft) before landing on the ground outside the ballpark.

Assuming the ball's initial velocity was 52° above the horizontal and ignoring air resistance, what did the initial speed of the ball need to be to produce such a home run if the ball was hit at a point 0.9 m (3.0 ft) above ground level? Assume that the ground was perfectly flat.

## Homework Equations

ΔX = V_i·cos(θ)·T

ΔY = V_i·sin(θ)·T + 0.5·a·T²

V_x = V_i·cos(θ)

V_y = V_i·sin(θ) + a·T

## The Attempt at a Solution

I know the velocity in the Y direction will be zero at the peak, and the ball lands 0.9 m below its launch point. If I could solve for how long the baseball is in the air, I could use the second equation I listed and solve for the initial velocity, since the acceleration is equal to -9.8 m/s². I'm not completely sure how to go about solving this problem and I feel like there's something I'm overlooking. Any suggestions?
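One way to sanity-check the algebra is to eliminate T between the two displacement equations: from ΔX = V_i·cos(θ)·T, substitute T = ΔX / (V_i·cos(θ)) into the ΔY equation and solve for V_i. A minimal sketch in Python (the function name and sign conventions are my own; ΔY is taken as -0.9 m because the ball lands below the point where it was hit):

```python
import math

def initial_speed(dx, dy, theta_deg, g=9.8):
    """Launch speed for a projectile covering horizontal range dx
    with net vertical displacement dy at launch angle theta_deg.

    Derived by eliminating T between
        dx = v*cos(th)*T  and  dy = v*sin(th)*T - 0.5*g*T**2,
    which gives  dy = dx*tan(th) - g*dx**2 / (2*v**2*cos(th)**2).
    """
    th = math.radians(theta_deg)
    v_sq = g * dx**2 / (2 * math.cos(th)**2 * (dx * math.tan(th) - dy))
    return math.sqrt(v_sq)

# Ball travels 188 m horizontally and lands 0.9 m BELOW launch height
print(round(initial_speed(188.0, -0.9, 52.0), 1))  # -> 43.5 (m/s)
```

With these numbers the required initial speed comes out to roughly 43.5 m/s, and the 0.9 m launch height barely changes the answer compared to assuming a flat trajectory from ground level.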