An object is launched from the origin with a velocity of 20 m/s at an angle of 30 degrees above the x axis. The object lands on a roof that is 4 meters high. What is the time when the object hits the roof?

a) 1.11 s
b) 1.27 s
c) 1.33 s
d) 1.49 s
e) 1.67 s

Here is my attempt: the vertical component is 20 sin(30°) = 10 m/s. Then I used the equation vf = vi + at:

0 = 10 - 9.8t
t = 1.02 s

But that isn't one of the answer choices. What am I doing wrong?
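A quick numerical sketch may help pin down the discrepancy (assuming g = 9.8 m/s² and no air resistance): setting vf = 0 in vf = vi + at gives the time when the vertical velocity is zero, i.e. the time to the peak of the trajectory, whereas the question asks when the object's height equals 4 m, which means solving the position equation y(t) = vy0·t − ½g·t² = 4 for t.

```python
import math

g = 9.8                       # m/s^2, downward (assumed)
v0 = 20.0                     # launch speed, m/s
angle = math.radians(30)
vy0 = v0 * math.sin(angle)    # vertical component: 10 m/s
y_roof = 4.0                  # roof height, m

# What setting vf = 0 in vf = vi + a*t actually computes: time to the apex.
t_peak = vy0 / g

# Times when the object is at roof height: solve y_roof = vy0*t - 0.5*g*t^2,
# i.e. the quadratic 0.5*g*t^2 - vy0*t + y_roof = 0.
disc = vy0**2 - 2 * g * y_roof
t_up = (vy0 - math.sqrt(disc)) / g    # passing 4 m on the way up
t_down = (vy0 + math.sqrt(disc)) / g  # descending onto the roof

print(round(t_peak, 2), round(t_up, 2), round(t_down, 2))  # 1.02 0.55 1.49
```

The 1.02 s result is the apex time, not a roof-crossing time; the later root of the quadratic, about 1.49 s, is when the object comes down onto the 4 m roof, matching choice d).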