1. The problem statement, all variables and given/known data

A projectile is launched at 10 m/s from a sloped surface. The surface is angled at 30°, and the projectile leaves the surface at 45° relative to the horizontal. Find the distance from the launch point to where the projectile lands. How long does it take to reach this point?

2. Relevant equations

v₀ = 10 m/s

3. The attempt at a solution

I tried rotating the coordinate system so that the inclined plane becomes the horizontal, which makes the launch angle 75°, but I don't think this is the right way to proceed: in the rotated frame gravity is no longer purely vertical, so the trajectory is not the usual parabola, and the distance from the origin would come out wrong. Thanks for the help; I'm really struggling with how to approach this problem.
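A sketch of one standard approach, not necessarily the intended one. It assumes the slope descends at 30° below the horizontal from the launch point (consistent with the 75° figure above, since 45° + 30° = 75°). The idea is to keep ordinary horizontal/vertical axes so gravity stays purely vertical, write the trajectory x(t) = v₀ cos α · t, y(t) = v₀ sin α · t − g t²/2, and find the time when it meets the slope line y = −x tan β:

```python
import math

g = 9.81                     # m/s^2
v0 = 10.0                    # launch speed, m/s
alpha = math.radians(45)     # launch angle above the horizontal
beta = math.radians(30)      # slope angle below the horizontal (assumed downhill)

# Landing condition: y(t) = -x(t) * tan(beta)
#   v0 sin(a) t - g t^2 / 2 = -v0 cos(a) t tan(b)
# Dividing by t and solving for the nonzero root:
t = 2 * v0 * (math.sin(alpha) + math.cos(alpha) * math.tan(beta)) / g

x = v0 * math.cos(alpha) * t     # horizontal distance travelled
d = x / math.cos(beta)           # distance measured along the slope

print(f"t = {t:.2f} s, d = {d:.1f} m")
```

With these numbers this prints roughly t ≈ 2.27 s and d ≈ 18.6 m; if the slope instead rises at 30°, the sign of the tan β term flips and both values come out much smaller.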