ghostbuster25
Homework Statement
OK, the question is as follows: a pilot flying at 45 m/s is trying to bomb a target on the ground 15 m in diameter. He is at a height of 98 m.
(a) How long does it take for the bomb to hit the ground?
(b) At what horizontal distance from the target does the pilot have to release the bomb?
(c) What margin of error does he have in hitting the target?
Just want a check on my solutions
(a)
t=\sqrt{2h/g} gives me \sqrt{2 \times 98/9.8} = 4.47 s
(b)
R=u\sqrt{2h/g} gives me 45\,\mathrm{m/s} \times \sqrt{2 \times 98/9.8} = 201 m
(c) The margin is the time window the target's width allows: 15 m / 45 m/s ≈ 0.33 s of error. (Note it's distance divided by speed; 45/15 = 3 would have the wrong units.)
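As a quick numerical sanity check of all three parts, here is a short sketch (variable names are my own; the margin in (c) is computed as target diameter divided by horizontal speed):

```python
import math

g = 9.8   # gravitational acceleration, m/s^2
h = 98.0  # release height, m
u = 45.0  # horizontal speed, m/s
d = 15.0  # target diameter, m

# (a) fall time from height h with zero initial vertical velocity
t = math.sqrt(2 * h / g)   # ~4.47 s

# (b) horizontal distance covered during the fall
R = u * t                  # ~201 m

# (c) time window during which release still lands on the target
margin = d / u             # ~0.33 s

print(t, R, margin)
```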
Seems reasonable to me!?
If I am wrong please show me where
Thanks