A bomb is dropped from an airplane at an altitude of 14400 ft. The plane is moving at 600 miles per hour. How far will the bomb move horizontally after it is released from the plane?
I use the formula for the distance traveled by an object with no air resistance, together with the Pythagorean theorem.
(1) d= vt - 0.5gt^2
(2) a^2 + b^2 = c^2
The Attempt at a Solution
1 mile = 5280 ft
g = 32 ft/sec^2
the plane's speed = 600 miles per hour = 880 ft per sec
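As a quick sanity check on that unit conversion, here is a small Python sketch (the constants come straight from the problem statement):

```python
# Convert the plane's speed from miles per hour to feet per second.
FT_PER_MILE = 5280
SEC_PER_HOUR = 3600

speed_mph = 600
speed_fps = speed_mph * FT_PER_MILE / SEC_PER_HOUR
print(speed_fps)  # 880.0
```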
Assuming the bomb has no initial vertical velocity (the plane flies level), I let v = 0 in the first equation above.
Then, -14400 = -(0.5)(32)(t^2)
Solving for t gives t = 30 sec.
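The fall-time step can be verified numerically; this sketch just solves 14400 = 0.5*g*t^2 for t with the values used above:

```python
import math

g = 32            # ft/s^2, as given above
altitude = 14400  # ft

# From -14400 = -(0.5)(32)(t^2), solve for the fall time t.
t = math.sqrt(2 * altitude / g)
print(t)  # 30.0
```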
Next, the diagonal distance from the point at which the bomb hits the ground to the plane is 28160 ft.
To find the horizontal distance traveled by the bomb, I use the Pythagorean theorem:
horizontal distance = square root of (28160^2 - 14400^2) = 24200 ft = 4.58 miles
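Reproducing that Pythagorean step numerically (using the 28160 ft diagonal from the attempt above) confirms the arithmetic comes out as stated:

```python
import math

diagonal = 28160   # ft, the diagonal distance used in the attempt
altitude = 14400   # ft

horizontal = math.sqrt(diagonal**2 - altitude**2)
print(horizontal)           # about 24200 ft
print(horizontal / 5280)    # about 4.58 miles
```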
My answer is wrong. The answer key says 5 miles. What did I do wrong?
Please explain where my reasoning fails. Thank you very much.