Crusaderking1
Homework Statement
According to the Guinness Book of World Records, the longest home run ever measured was hit by Roy “Dizzy” Carlyle in a minor league game. The ball traveled 188 m before landing on the ground outside the ballpark.
Assuming the ball's initial velocity was 56 degrees above the horizontal and ignoring air resistance, what did the initial speed of the ball need to be to produce such a home run if the ball was hit at a point 0.9 m above ground level? Assume that the ground was perfectly flat.
Homework Equations
Any constant acceleration equations.
The Attempt at a Solution
OK, so here is my problem: I have all the distances, but I can't find any of the velocities.
x = 188 m
y = 0
ax = 0
ay = -9.8 m/s^2
x0 = 0
y0 = 0.9 m
v0 sin 56° = v0y (v0 unknown)
v0 cos 56° = v0x (v0 unknown)
I don't really know what to do from here. ='(
I did write 0 = v0y^2 + 2(-9.8)(-0.9), which gave me v0y = 4.2 m/s, and I have a lot of values after that, but I don't think I'm doing it right.
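As a sanity check on the setup above, here is one way to get v0 directly (a sketch assuming the standard no-air-resistance projectile equations; this may not be the method your textbook intends). Writing x(t) = v0·cosθ·t and y(t) = y0 + v0·sinθ·t − ½gt², then eliminating t at the landing point (y = 0, x = 188 m) gives v0² = g·x² / (2·cos²θ·(y0 + x·tanθ)):

```python
import math

g = 9.8           # gravitational acceleration, m/s^2
x = 188.0         # horizontal distance traveled, m
y0 = 0.9          # height of the ball when hit, m
theta = math.radians(56)  # launch angle above horizontal

# v0^2 = g*x^2 / (2*cos^2(theta) * (y0 + x*tan(theta))),
# obtained by substituting t = x / (v0*cos(theta)) into the
# vertical position equation and setting y = 0 at landing.
v0 = math.sqrt(g * x**2 / (2 * math.cos(theta)**2 * (y0 + x * math.tan(theta))))
print(f"v0 = {v0:.1f} m/s")  # ≈ 44.5 m/s
```

Note the one-equation approach sidesteps the issue in the attempt above: v0y alone can't be found from the drop height, because the ball's vertical velocity at landing isn't zero.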