1. The problem statement, all variables and given/known data

Prove that (a + 1/a)^2 + (b + 1/b)^2 >= 25/2, given that a + b = 1 and a, b are positive.

2. Relevant equations

3. The attempt at a solution

I tried replacing b with 1 - a, but that gives a 6th-degree polynomial in a, and I don't know how to prove the inequality for it. Since the problem is perfectly symmetric in a and b, the point a = b = 0.5 is of interest, and I found that it is the minimum. I also tried Lagrange multipliers, but that leads to a 3rd-degree equation. In the end I got the answer by trial and error. Is there some other way of finding the solution, e.g. by expanding?
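As a quick numerical sanity check (not a proof) of the claim that a = b = 0.5 minimizes the expression, one can substitute b = 1 - a and scan the open interval (0, 1); the function names here are just for illustration:

```python
def f(a):
    # the target expression with b eliminated via the constraint a + b = 1
    b = 1.0 - a
    return (a + 1 / a) ** 2 + (b + 1 / b) ** 2

# value at the symmetric point: (0.5 + 2)^2 + (0.5 + 2)^2 = 12.5 = 25/2
center = f(0.5)

# coarse scan over the open interval (0, 1)
samples = [i / 1000 for i in range(1, 1000)]
minimum = min(f(a) for a in samples)

print(center)                      # 12.5
print(minimum >= center - 1e-12)   # True: no sample beats a = b = 0.5
```

This only supports the conjecture numerically; an actual proof still needs an argument such as convexity or expanding with the constraint.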