1. The problem statement, all variables and given/known data

A bowling ball rolls off an angled roof. The roof is sloped downward at 18.4 degrees, and its edge is 12 m above the ground. The question is: how long does it take the ball to hit the ground?

2. Relevant equations

Δx = v_i Δt + (1/2)a(Δt)^2

I know this is the equation I have to use, but I'm having trouble isolating Δt.

3. The attempt at a solution

I used the quadratic formula and ended up getting Δt = 1.56 s, but I don't think that's right. The working equation I got was Δt = (-2g·h_i)^(1/2)/g (with h_i = -12 m, since the displacement is downward), and I assumed v_i = 0. I don't know how to do this, help please!
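As a numerical check of the attempt above, here is a minimal sketch that applies the quadratic formula to Δx = v_i Δt + (1/2)a(Δt)^2, taking downward as positive. It keeps the poster's assumption that v_i = 0 (no launch speed off the roof is given in the problem, so the 18.4° slope only enters if one is known); g = 9.8 m/s² and the function name are choices made here, not from the original post.

```python
import math

def fall_time(v_i, h, g=9.8):
    """Time to fall a height h (m), taking downward as positive.

    Solves (1/2)*g*t^2 + v_i*t - h = 0 for the positive root
    via the quadratic formula, matching the poster's approach.
    v_i is the initial downward speed (m/s); assumed 0 here.
    """
    disc = v_i**2 + 2 * g * h          # discriminant b^2 - 4ac with a = g/2, c = -h
    return (-v_i + math.sqrt(disc)) / g  # positive root only

# Roof edge is 12 m up; poster assumed v_i = 0.
t = fall_time(v_i=0.0, h=12.0)
print(f"fall time: {t:.2f} s")
```

With v_i = 0 the quadratic collapses to t = sqrt(2h/g) ≈ 1.56 s, which agrees with the value the poster obtained, so the algebra in the attempt checks out under that assumption.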