Hi! I was given this physics problem and I'm trying to solve it, but I'm not sure if I'm doing it correctly. I have to write a formula to calculate the _minimal_ speed (I missed this part at first) at which a projectile must be shot to pass through a point (x, y).

My first step was to write a formula for the exact speed at which a projectile must be shot to pass through (x, y) when launched at an angle 'a'. This is what I get:

s = (sqrt(g) * x) / (sqrt(2 * (x * tan(a) - y)) * cos(a))    (g > 0, a != 90°, a > arctan(y / x))

I'm quite sure that this is correct. Now I need to figure out the formula for the _minimal_ speed, and I'm quite sure it depends on the angle. Look at the denominator:

sqrt(2 * (x * tan(a) - y)) * cos(a)

The bigger this part gets, the smaller the speed gets. '2', 'x' and 'y' are constant, so it must come down to 'a', right? So I need to find the angle at which this whole expression is as big as possible, but tan(a) and cos(a) pull in opposite directions as 'a' grows. How do I do that? Am I thinking correctly? Does any of this actually make sense?
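
To sanity-check my formula and my hunch that the minimum sits strictly between arctan(y / x) and 90°, I threw together this little numerical scan. The values g = 9.81, x = 10, y = 5 are just ones I picked for testing, not part of the problem; any x > 0 and g > 0 should behave the same way:

```python
import math

# Test values I picked arbitrarily (not from the problem statement)
g = 9.81
x = 10.0
y = 5.0

def speed(a):
    """My formula: speed needed to pass through (x, y) when launched at angle a (radians)."""
    return (math.sqrt(g) * x) / (math.sqrt(2 * (x * math.tan(a) - y)) * math.cos(a))

# The formula is only valid for arctan(y / x) < a < 90 degrees,
# so scan a fine grid strictly inside that open interval.
lo = math.atan2(y, x)
hi = math.pi / 2
n = 100_000

best_a, best_s = None, float("inf")
for i in range(1, n):
    a = lo + (hi - lo) * i / n
    s = speed(a)
    if s < best_s:
        best_a, best_s = a, s

print(f"minimum speed ~ {best_s:.4f} at angle ~ {math.degrees(best_a):.3f} degrees")
```

For these test values the scan reports a single minimum strictly inside the interval, with the speed blowing up toward both arctan(y / x) and 90°. So the trade-off between tan(a) and cos(a) really does seem to balance out at one specific angle; I just don't know how to find that angle analytically.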