roam
Hi
Here's my problem:
A baseball is thrown with an initial velocity of 100 m/s at an angle of 30° above the horizontal. How far from the throwing point will the baseball attain its original level?
The attempt at a solution:
vix = 100 cos 30° = 86.6 m/s
viy = 100 sin 30° = 50 m/s
To find the distance we need to use v = \frac{d}{t}, so d = v_{ix}t, where v_{ix} is the constant horizontal velocity.
Now, in order to find the time, we can use:
y = v_{iy}t + \frac{1}{2}a_{y}t^2
y = 0 since the ball is coming back to its original height.
0 = (50 \text{ m/s})\, t + \frac{1}{2}(-9.8 \text{ m/s}^2)\, t^2
Am I right...? I can't quite see how to solve this for the time…
Thanks
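In case it helps to check the numbers, here is a minimal Python sketch of the setup above (the variable names are my own, and it assumes the ball lands back at launch height, as stated). The key step is that t factors out of the quadratic, so the nonzero root is t = 2v_{iy}/g.

import math

v0 = 100.0                    # initial speed, m/s
theta = math.radians(30.0)    # launch angle
g = 9.8                       # magnitude of gravitational acceleration, m/s^2

vix = v0 * math.cos(theta)    # horizontal component, ~86.6 m/s
viy = v0 * math.sin(theta)    # vertical component, 50 m/s

# 0 = viy*t - (g/2)*t^2 = t*(viy - (g/2)*t), so the nonzero root is:
t = 2.0 * viy / g             # time of flight, ~10.2 s
d = vix * t                   # horizontal distance, ~884 m

print(f"t = {t:.2f} s, d = {d:.1f} m")

The distance agrees with the standard range formula d = \frac{v_0^2 \sin 2\theta}{g} ≈ 884 m.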