1. Jun 13, 2010

### ghostbuster25

1. The problem statement, all variables and given/known data
OK, the question is as follows. A pilot flying at 45 m/s is trying to bomb a target on the ground 15 m in diameter. He is at a height of 98 m.
(a) How long does it take for the bomb to hit the ground?
(b) At what horizontal distance does the pilot have to be when he releases the bomb?
(c) What margin of error does he have in hitting the target?

Just want a check on my solutions
(a)
t = $$\sqrt{2h/g}$$, which gives me $$\sqrt{2 \times 98/9.8}$$ = 4.47 s

(b)
R = u$$\sqrt{2h/g}$$, which gives me 45 m/s $$\times \sqrt{2 \times 98/9.8}$$ = 201 m

(c) 45m/s / 15 = 3 seconds of error

Seems reasonable to me!?

If I'm wrong, please show me where.

Thanks

2. Jun 13, 2010

### kuruman

Parts a and b are OK. Your answer to part c does not look right. Are you saying that he can release the bomb 3 seconds late and still hit the target? In three seconds he travels 135 m and that's way too much. What you need is the time required to travel the length of the target, i.e. 15 m. If he travels farther than that, he misses.

3. Jun 13, 2010

### ghostbuster25

Ah yeah, that makes more sense: 15 m / 45 m/s = 0.333 s
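For anyone following along, the three results in this thread can be checked with a short script (the variable names here are my own, not from the problem statement):

```python
import math

g = 9.8   # m/s^2, acceleration due to gravity
h = 98.0  # m, release height
u = 45.0  # m/s, horizontal speed of the plane
d = 15.0  # m, diameter of the target

t = math.sqrt(2 * h / g)  # (a) fall time: sqrt(2*98/9.8) ~ 4.47 s
R = u * t                 # (b) horizontal release distance ~ 201 m
margin = d / u            # (c) time to cross the target ~ 0.333 s

print(f"t = {t:.2f} s, R = {R:.0f} m, margin = {margin:.3f} s")
```

Note that part (c) divides distance by speed (15 m / 45 m/s), not the other way around, which is exactly the correction made above.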