1. The problem statement, all variables and given/known data

According to the Guinness Book of World Records, the longest home run ever measured was hit by Roy “Dizzy” Carlyle in a minor league game. The ball traveled 188 m (618 ft) before landing on the ground outside the ballpark.

Assuming the ball's initial velocity was 52° above the horizontal and ignoring air resistance, what did the initial speed of the ball need to be to produce such a home run if the ball was hit at a point 0.9 m (3.0 ft) above ground level? Assume that the ground was perfectly flat.

2. Relevant equations

ΔX = V_{i}cos(θ)T

ΔY = V_{i}sin(θ)T + 0.5aT^{2}

V_{x} = V_{i}cos(θ)

V_{y} = V_{i}sin(θ) + aT
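One way to combine the first two equations (a standard algebraic step, not part of the original post): solve the ΔX equation for the flight time T and substitute it into the ΔY equation, which eliminates T and leaves a single equation in V_{i}:

T = ΔX / (V_{i}cos(θ))

ΔY = ΔX tan(θ) + 0.5a ΔX^{2} / (V_{i}^{2}cos^{2}(θ))

V_{i} = sqrt( a ΔX^{2} / (2cos^{2}(θ)(ΔY − ΔX tan(θ))) )

With a = −9.8 m/s^{2} and ΔY < ΔX tan(θ), both numerator and denominator are negative, so the argument of the square root is positive.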

3. The attempt at a solution

I know the vertical velocity is zero at the peak of the trajectory, though not at landing. Since the ball is hit 0.9 m above the ground and lands on the ground, the final vertical displacement is ΔY = −0.9 m rather than zero. If I could solve for how long the baseball is in the air, I could use the second equation I listed and solve for the initial velocity, since the acceleration is −9.8 m/s^{2}. I'm not completely sure how to go about solving this problem and I feel like there's something I'm overlooking. Any suggestions?
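As a numeric check, here is a minimal sketch of the elimination approach: solve ΔX = V_{i}cos(θ)T for T, substitute into the ΔY equation, and solve for V_{i}. The values g = 9.8 m/s², ΔX = 188 m, ΔY = −0.9 m, and θ = 52° are taken from the problem statement; the variable names are this example's own.

```python
import math

g = 9.8                      # magnitude of gravitational acceleration, m/s^2
dx = 188.0                   # horizontal range, m
dy = -0.9                    # vertical displacement: lands 0.9 m below the bat
theta = math.radians(52)     # launch angle above the horizontal

# Eliminating T between dx = v*cos(theta)*T and
# dy = v*sin(theta)*T - 0.5*g*T^2 gives:
#   v = sqrt( g*dx^2 / (2*cos^2(theta)*(dx*tan(theta) - dy)) )
v = math.sqrt(g * dx**2 / (2 * math.cos(theta)**2 * (dx * math.tan(theta) - dy)))
print(f"initial speed ≈ {v:.1f} m/s")    # ≈ 43.5 m/s

# Recover the flight time from the horizontal equation
T = dx / (v * math.cos(theta))
print(f"flight time ≈ {T:.1f} s")        # ≈ 7.0 s
```

Note the sign convention: with the upward direction positive, a = −g, which is why the landing point 0.9 m below the launch point enters as dy = −0.9.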

**Physics Forums - The Fusion of Science and Community**

# Initial velocity
