Homework Statement
A ball is kicked off a hill with an initial horizontal velocity of 10 m/s.
The pitch of the hill from where the ball is kicked is -20°. Determine how far the ball lands from its original position.
Not given, but I'm assuming gravity = -9.81 m/s².
This is how I see the problem:
http://img78.imageshack.us/img78/6411/projectileeu0.gif
Homework Equations
a = v/t
d = v₀t + ½at²
The Attempt at a Solution
I would love to say that:
a = v/t
9.81 = 10/t
t = 1.02 s
Then just split and solve, so that
dx = 10.2 m, dy = -5.1 m
√(dx² + dy²) = 11.4 m
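As a quick numeric check of that arithmetic (a throwaway Python sketch; the variable names are mine):
Code:
import math

g = 9.81      # m/s^2, magnitude of gravity
v0x = 10.0    # m/s, initial horizontal velocity

t = v0x / g               # 1.02 s (the step I suspect is illegal)
dx = v0x * t              # 10.2 m
dy = -0.5 * g * t**2      # -5.1 m
print(math.hypot(dx, dy)) # 11.4 m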
But I can't do that, because there is no initial y velocity. So instead I work from the displacement equation, the only one whose constants I actually know:
d = v₀t + ½at²
Splitting into x and y:
dy = -4.905 t²
t = √(dy / -4.905)
dx = v₀ₓ·t = 10t
t = dx/10
Setting them equal:
dx/10 = √(dy / -4.905)
dx = 10√(dy / -4.905)
dy = -4.905 (dx/10)²
To find the distance from the start, use the law of cosines, or in this case just the Pythagorean theorem:
d = √[(10√(dy/-4.905))² + (-4.905 (dx/10)²)²]
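Written as code, the relations I have so far look like this (a quick Python sketch with my numbers, g = 9.81 m/s² and v₀ₓ = 10 m/s; the function names are mine):
Code:
import math

g = 9.81     # m/s^2
v0x = 10.0   # m/s

def dy_from_dx(dx):
    # Trajectory relation: dy = -(g/2) * (dx / v0x)^2
    return -0.5 * g * (dx / v0x) ** 2

def dx_from_dy(dy):
    # The same relation inverted: dx = v0x * sqrt(dy / -(g/2))
    return v0x * math.sqrt(dy / (-0.5 * g))

# Substituting one into the other just returns the input,
# so a second, independent condition is needed to pin down the landing point.
print(dx_from_dy(dy_from_dx(7.0)))  # 7.0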
While this is nice, it isn't getting me any closer to the answer; I still need to solve for time somehow.
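One idea, though I'm not sure it's legitimate: the hill itself hasn't been used yet. If the hill is a straight line through the launch point at -20°, then the ball lands where the parabola meets that line, dy = dx·tan(-20°), and that does fix the time. A sketch of that assumption:
Code:
import math

g = 9.81                     # m/s^2
v0x = 10.0                   # m/s
pitch = math.radians(-20.0)  # hill slope below the horizontal (my assumption: straight line)

# Assumed landing condition: the parabola meets the hill line y = x*tan(pitch):
#   -(g/2) t^2 = v0x * t * tan(pitch)
# The nonzero root gives the flight time.
t = -2.0 * v0x * math.tan(pitch) / g    # ~0.742 s
dx = v0x * t                            # ~7.42 m
dy = -0.5 * g * t**2                    # ~-2.70 m
print(f"t = {t:.3f} s, d = {math.hypot(dx, dy):.2f} m")  # d ~ 7.90 m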
So, if anyone can give me a nudge (or tell me whether that last idea is legitimate), I would be extremely grateful.