1. The problem statement, all variables and given/known data

A ball is thrown down with an initial velocity of 2.0 m/s from a height of 1.75 m. Determine how long until it hits the ground.

initial velocity = 2.0 m/s
distance = 1.75 m
acceleration = 9.8 m/s²
time = ?

2. Relevant equations

d = vᵢt + (1/2)at²

3. The attempt at a solution

1.75 = 2t + (1/2)(9.8)t²
1.75 = 2t + 4.9t²
Square root of both sides???
1.32 = 1.4t + 2.21t
1.32 = 3.61t
All divided by 3.61???
t = 0.36

.... I know the solution to be 0.43 seconds, but can't get there! Any help is greatly appreciated!
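Not part of the original post, but a quick numerical sketch of the intended route: taking the square root of each term separately isn't a valid algebraic step. The equation 4.9t² + 2t − 1.75 = 0 is quadratic in t, so the quadratic formula applies, and its positive root matches the expected 0.43 s:

```python
import math

# Values from the problem statement (taking "down" as positive,
# so initial velocity and gravity share a sign):
v_i = 2.0   # initial velocity, m/s
a = 9.8     # acceleration due to gravity, m/s^2
d = 1.75    # drop height, m

# d = v_i*t + (1/2)*a*t^2  rearranged into standard quadratic form:
# (a/2)*t^2 + v_i*t - d = 0
A, B, C = a / 2, v_i, -d

# Quadratic formula; only the "+" root gives a physically meaningful
# (positive) time.
t = (-B + math.sqrt(B**2 - 4 * A * C)) / (2 * A)
print(round(t, 2))  # -> 0.43
```

The same arithmetic by hand: t = (−2 + √(4 + 34.3)) / 9.8 ≈ 4.19 / 9.8 ≈ 0.43 s.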