1. The problem statement, all variables and given/known data

A military helicopter on a training mission is flying horizontally at a speed of 60.0 m/s and accidentally drops a bomb (fortunately not armed) at an elevation of 300 m. You can ignore air resistance. How much time is required for the bomb to reach the earth? How far does it travel horizontally while falling? Find the horizontal and vertical components of its velocity just before it strikes the earth.

2. Relevant equations

v_f = v_i + a*t
x_f - x_i = (t/2)(v_i + v_f)
x_f - x_i = v_i*t + 0.5*a*t^2
v_f^2 = v_i^2 + 2*a*(x_f - x_i)

3. The attempt at a solution

I want to really understand this, so I will be breaking the work up into parts, since the result of each question usually helps solve the one after it.

For the first question: how much time is required for the bomb to reach the earth?

The knowns: initial position x_i = 300 m, final position x_f = 0 m (taking the ground as zero), acceleration a = -9.8 m/s^2, initial velocity v_i = 60 m/s.
The unknown: time.

Based on the values provided, the following equation seems suited to the problem:

x_f - x_i = v_i*t + 0.5*a*t^2
0 - 300 = 60t + 0.5(-9.8)t^2
-4.9t^2 + 60t + 300 = 0
t = (-60 ± 97.365) / (-9.8)
t = 16.06 or t = -3.81

This is incorrect. How would I have to approach this problem?
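To check candidate answers numerically, the constant-acceleration equations above can be evaluated in a short script. This is a minimal sketch, and it assumes the standard textbook treatment of projectile motion: the horizontal and vertical motions are handled independently, and a dropped (not thrown) object starts with zero vertical velocity, inheriting only the helicopter's horizontal 60.0 m/s.

```python
import math

g = 9.8        # m/s^2, magnitude of gravitational acceleration
v_x = 60.0     # m/s, horizontal speed (shared with the helicopter)
h = 300.0      # m, drop height

# Vertical motion, assuming initial vertical velocity is 0:
# h = 0.5 * g * t^2  =>  t = sqrt(2h / g)
t = math.sqrt(2 * h / g)

# Horizontal motion: constant velocity, since air resistance is ignored
x = v_x * t

# Velocity components just before impact
v_y = g * t    # downward, from v_f = v_i + a*t with v_i = 0

print(f"time to fall:        {t:.2f} s")
print(f"horizontal distance: {x:.1f} m")
print(f"v_x at impact:       {v_x:.1f} m/s")
print(f"v_y at impact:       {v_y:.1f} m/s")
```

Comparing the script's fall time against the quadratic's roots makes it easy to see whether the chosen initial conditions were the right ones.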