1. The problem statement, all variables and given/known data
A ball is dropped from a balloon going up at a speed of 7 m/s. If the balloon was at a height of 60 m at the time of dropping the ball, how long will the ball take to reach the ground? Ans: 4.3 seconds

2. Relevant equations

3. The attempt at a solution
I can't understand the question. Is the initial velocity of the ball zero when it is dropped, or is it 7 m/s? What value should I take for g? Should I take g = 9.8 m/s[itex]^2[/itex]? Please help me. I assumed the ball is dropped from a height of 60 m with initial velocity = 0 and a = g = 9.8 m/s[itex]^2[/itex], but that gives the wrong answer. What is the concept behind this question? How do I solve it?
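As a check on the interpretation, here is a minimal numeric sketch. It assumes the ball initially shares the balloon's upward velocity (+7 m/s) at the moment of release, since an object "dropped" from a moving platform keeps the platform's velocity; with that assumption the stated answer of 4.3 s comes out of the standard constant-acceleration equation.

```python
import math

# Taking "up" as positive, position of the ball after release:
#   y(t) = y0 + v0*t - 0.5*g*t**2
# The ball hits the ground when y(t) = 0.
y0 = 60.0   # height at release (m)
v0 = 7.0    # initial velocity, upward, inherited from the balloon (m/s)
g = 9.8     # gravitational acceleration (m/s^2)

# 0 = y0 + v0*t - 0.5*g*t^2  rearranges to  0.5*g*t^2 - v0*t - y0 = 0,
# a quadratic in t; keep the positive root.
a, b, c = 0.5 * g, -v0, -y0
t = (-b + math.sqrt(b * b - 4 * a * c)) / (2 * a)
print(round(t, 1))  # 4.3
```

If instead the initial velocity is taken as 0 (ball at rest at 60 m), the same formula gives t = sqrt(2*60/9.8) ≈ 3.5 s, which does not match the given answer; that supports the +7 m/s interpretation.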