I got this problem for physics and I can't seem to get it. It goes like this: A projectile is launched at an angle of 40 degrees with an initial velocity of 100 m/s. One hundred meters away is the beginning of a hill that slopes upward at an angle of 20 degrees. The projectile strikes the hill a distance L up the slope. What is the value of L?

Okay, so far I've gotten:

Horizontal: a = 0, Vi = 76.60 m/s, x = 100 m, t = 1.31 s
Vertical: a = -9.81 m/s^2, Vi = 64.28 m/s, y = 75.79 m

That's all I've gotten, but I can't figure out how to see where it lands. Maybe someone can give me a hint or something, I'm really stuck. Thank you.
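In case it helps show where I'm at, here's a quick Python sketch I used to double-check my numbers (this is just my own check, not part of the assignment — I'm assuming g = 9.81 m/s^2 and that the hill starts right at x = 100 m):

```python
import math

g = 9.81            # m/s^2, gravity (assumed value)
v0 = 100.0          # m/s, launch speed
theta = math.radians(40)  # launch angle

vx = v0 * math.cos(theta)    # horizontal velocity component
vy = v0 * math.sin(theta)    # vertical velocity component

t = 100.0 / vx               # time to reach x = 100 m (base of the hill)
y = vy * t - 0.5 * g * t**2  # height of the projectile at that moment

print(round(vx, 2), round(vy, 2))  # velocity components
print(round(t, 3), round(y, 2))    # time and height at the hill's base
```

With the unrounded t the height at x = 100 m comes out a bit lower than the 75.79 m I got by hand (I had rounded t to 1.31 s first), but either way the projectile is still way above the base of the hill there, which is where I get stuck.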