Prove: a√(b/c) + b√(c/a) + c√(a/b) ≥ 1

Andrax

Homework Statement


Let a, b, c be real numbers in ##]0,+\infty[##,
with ##a+b+c=1## and ##a \geq b \geq c##.
Prove that ##a\sqrt{\frac{b}{c}}+b\sqrt{\frac{c}{a}}+c\sqrt{\frac{a}{b}} \geq 1##.

The Attempt at a Solution


Here is what I tried.
Multiplying through by ##\sqrt{abc}##, the inequality becomes ##ab\sqrt{a}+bc\sqrt{b}+ac\sqrt{c}\geq \sqrt{abc}##.
Also, since ##a, b, c \in ]0,+\infty[## and ##a+b+c=1##, each of ##a##, ##b##, ##c## is ##\leq 1##.
I also tried the inequality (I forget its name) ##\sqrt{abc}+1\geq\sqrt{abc}\,(a+b+c)##, but it doesn't give the wanted result. I've been working on this exercise for 4 days in a row; this is an olympiad exercise, by the way, so any help would be appreciated.
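Written out, that multiplication by ##\sqrt{abc}>0## goes term by term:
$$a\sqrt{\tfrac{b}{c}}\cdot\sqrt{abc}=a\sqrt{ab^2}=ab\sqrt{a},\qquad b\sqrt{\tfrac{c}{a}}\cdot\sqrt{abc}=b\sqrt{bc^2}=bc\sqrt{b},\qquad c\sqrt{\tfrac{a}{b}}\cdot\sqrt{abc}=c\sqrt{a^2c}=ac\sqrt{c},$$
so the target inequality is indeed equivalent to ##ab\sqrt{a}+bc\sqrt{b}+ac\sqrt{c}\geq \sqrt{abc}##.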
 
Andrax said:
Let a, b, c be real numbers in ##]0,+\infty[## with ##a+b+c=1## and ##a \geq b \geq c##. Prove that ##a\sqrt{\frac{b}{c}}+b\sqrt{\frac{c}{a}}+c\sqrt{\frac{a}{c}} \geq 1##

I assume you've got a typo there and you're meant to prove that ## a \sqrt{\frac{b}{c}}+b \sqrt{\frac{c}{a}}+c \sqrt{\frac{a}{b}} \geq 1 ##

And 1 is not a possible value for a,b,c; we can say ##1 \gt a\geq b\geq c\gt 0##

I don't know the answer, but have you tried starting from somewhere else, e.g. ##(\sqrt a + \sqrt b + \sqrt c)^2##?
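For reference, expanding that square (just the algebra, not a solution) and using ##a+b+c=1##:
$$(\sqrt a + \sqrt b + \sqrt c)^2 = a + b + c + 2\left(\sqrt{ab}+\sqrt{bc}+\sqrt{ca}\right) = 1 + 2\left(\sqrt{ab}+\sqrt{bc}+\sqrt{ca}\right).$$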
 
Joffan said:
I assume you've got a typo there [...] have you tried starting from somewhere else, e.g. ##(\sqrt a + \sqrt b + \sqrt c)^2##?
Yes, I had a typo, thanks. Hm, interesting, I'll try that (though it probably won't work anyway).
 
didn't work :/ I'm stuck lol
 
I think I managed to get it into the form "show ##c^2(bc-a^2)+a^2(ca-b^2)+b^2(ab-c^2) \geq 0## for all positive a, b, c". Looks better, but I don't see where to go from there.
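For comparison, here is one standard route to a polynomial form (a sketch; it uses the substitution ##a = x^2##, ##b = y^2##, ##c = z^2## with ##x,y,z>0##, and need not be the same reduction as the one above): since ##a+b+c=1##, the claim is the homogeneous inequality ##a\sqrt{b/c}+b\sqrt{c/a}+c\sqrt{a/b} \geq a+b+c##, which after substituting and multiplying through by ##xyz## becomes
$$x^3y^2 + y^3z^2 + z^3x^2 \;\geq\; xyz\,(x^2+y^2+z^2).$$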
 
Andrax said:
Let a, b, c be real numbers in ##]0,+\infty[## with ##a+b+c=1## and ##a \geq b \geq c##. Prove that ##a\sqrt{\frac{b}{c}}+b\sqrt{\frac{c}{a}}+c\sqrt{\frac{a}{b}} \geq 1##

This problem is solvable via constrained optimization. If
##f(a,b,c) = a \sqrt{\frac{b}{c}} + b \sqrt{\frac{c}{a}} + c \sqrt{\frac{a}{b}}## and ##g(a,b,c) = a+b+c##, we can look at the problem
$$\min f(a,b,c) \quad \text{subject to} \quad g(a,b,c) = 1,\ a \geq b,\ b \geq c.$$
If the minimal value of f is ≥ 1, we are done. Note that f, g, and all the inequality constraint functions are homogeneous of degree 1, so if (a,b,c) solves the problem with g = m, then (a/m,b/m,c/m) solves it with g = 1. Therefore, we might as well fix c = 1, then solve the problem of minimizing F(a,b) = f(a,b,1) subject to the conditions a ≥ b ≥ 1. We can show that the point (a,b) = (1,1) satisfies the Karush-Kuhn-Tucker conditions for this problem, so it is a (local) constrained minimum. To show it is the global minimum requires a bit more work.
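For concreteness, that KKT check at (1,1) can be written out (a sketch, writing the constraints as ##b - a \leq 0## and ##1 - b \leq 0##, both active at (1,1)): with ##F(a,b) = f(a,b,1) = a\sqrt{b} + b/\sqrt{a} + \sqrt{a/b}##,
$$\nabla F(1,1) = \left(\sqrt{b} - \tfrac{b}{2a^{3/2}} + \tfrac{1}{2\sqrt{ab}},\;\; \tfrac{a}{2\sqrt{b}} + \tfrac{1}{\sqrt{a}} - \tfrac{\sqrt{a}}{2b^{3/2}}\right)\Big|_{(1,1)} = (1,\,1),$$
and stationarity ##\nabla F + \mu_1 \nabla(b-a) + \mu_2 \nabla(1-b) = 0## holds with multipliers ##\mu_1 = 1 \geq 0## and ##\mu_2 = 2 \geq 0##.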

Let ##F_a = \partial F / \partial a##, and note that
$$F_a = \frac{N}{D}, \qquad N = 2 a^2 b -\sqrt{a}\,b^{3/2}+a^{3/2}, \qquad D = 2 a^2 \sqrt{b},$$
so ##\text{sign}(F_a) = \text{sign}(N)##. We want to show that N > 0 on ##X = \{a \geq b \geq 1\},## so look at ##N_a = \partial N /\partial a = N_1 /D_1,## where ##D_1 = 2 \sqrt{a}## and ##N_1 = 3a + 8 a^{3/2} b - b^{3/2}.## By taking a derivative again it is not hard to show that the minimum of ##N_1## in region X is > 0, so ##F_a > 0## in X. That is, F(a,b) is strictly increasing in a, for a ≥ b. Therefore, the constrained minimum of F must occur along the line a = b, so it is enough to look at ##F(b,b) = 1 + \sqrt{b} + b^{3/2},## which we want to minimize for b ≥ 1. The optimal solution is at (a,b) = (1,1), hence we have the optimum (a,b,c) = (1,1,1). Now we just need to re-scale to (a,b,c) = (1/3,1/3,1/3) to get the minimum of f for g = 1.
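As a quick numerical sanity check (a minimal sketch, not part of the proof, assuming NumPy is available): random points on the constrained simplex never take f below 1, and the value approaches 1 near a = b = c = 1/3.

Code:
import numpy as np

# Sanity check only: sample points with a + b + c = 1 and a >= b >= c > 0
# and track the smallest observed value of
# f(a, b, c) = a*sqrt(b/c) + b*sqrt(c/a) + c*sqrt(a/b).
rng = np.random.default_rng(0)

def f(a, b, c):
    return a * np.sqrt(b / c) + b * np.sqrt(c / a) + c * np.sqrt(a / b)

min_val = np.inf
for _ in range(100000):
    x = rng.random(3)
    x /= x.sum()                  # normalize so that a + b + c = 1
    a, b, c = np.sort(x)[::-1]    # enforce a >= b >= c
    min_val = min(min_val, f(a, b, c))

print(min_val)   # observed minimum stays above 1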
 
Ray Vickson said:
This problem is solvable via constrained optimization. [...]

thank you
 
Ray's is an impressive approach, but I'm not convinced about this step:
Ray Vickson said:
Note that f, g, and all the inequality constraint functions are homogeneous of degree 1, so if (a,b,c) solves the problem with g = m, then (a/m,b/m,c/m) solves it with g = 1. Therefore, we might as well fix c = 1, then solve the problem of minimizing F(a,b) = f(a,b,1) subject to the conditions a ≥ b ≥ 1.
It seems to me that we are not minimizing for a particular m - rather we are allowing m to vary. The fact that a and b end up at their constraints does nothing to dispel this uneasiness.
 
Joffan said:
It seems to me that we are not minimizing for a particular m - rather we are allowing m to vary. The fact that a and b end up at their constraints does nothing to dispel this uneasiness.

Neglecting the sum constraint is OK (see below). Perhaps a more damaging objection is that by first setting c = 1, then solving the (a,b) problem (without a sum constraint) we can lose the optimum. That does not happen, but it is probably better to work with the full 3-variable problem. We can solve for (a,b,c) without a sum constraint by recognizing that
$$\min f(a,b,c) = a b^{1/2}c^{-1/2} + b c^{1/2}a^{-1/2} + c a^{1/2} b^{-1/2} \quad \text{subject to} \quad g_1(a,b,c) \equiv a^{-1} b \leq 1,\quad c \geq 1$$
is a posynomial geometric programming problem (that is, f and ##g_1## are posynomials), so any Kuhn-Tucker point is a global optimum, despite the lack of convexity. (Essentially, this follows by writing a = exp(u), b = exp(v), and c = exp(w), and noting that we have a convex programming problem in the variables u, v, and w.) It is easy to verify that (a,b,c) = (1,1,1) is a Kuhn-Tucker point. In addition, it satisfies the missing constraint b ≥ c, hence it is *also a global solution when that constraint is added to the problem*.
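Spelling that substitution out (a sketch of the standard geometric-programming reduction): with ##a = e^u##, ##b = e^v##, ##c = e^w##,
$$f = e^{\,u + v/2 - w/2} + e^{\,v + w/2 - u/2} + e^{\,w + u/2 - v/2}, \qquad g_1 = e^{\,v - u},$$
so each term is the exponential of an affine function of (u,v,w); hence f is convex, and the constraints ##g_1 \leq 1## and ##c \geq 1## become the linear constraints ##v \leq u## and ##w \geq 0##.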

Now, about the sum constraint. Without that constraint we get a solution that satisfies sum = 3, so it is also the optimal solution of the problem when the constraint sum = 3 is added. In other words, we have ##f(1,1,1) \leq f(a,b,c)## for all (a,b,c) that satisfy the two inequality constraints and the sum constraint a+b+c = 3. Thus, ##f(1/3,1/3,1/3) = (1/3)f(1,1,1) \leq (1/3)f(a,b,c) = f(a',b',c')## for any (a',b',c') = (a/3,b/3,c/3) that satisfy the constraint a'+b'+c' = 1. Since ##f(1,1,1) = 3##, this says exactly that ##f(a',b',c') \geq 1## on the original constraint set.

Finally, for definitions and properties of posynomials, see, e.g.,
http://en.wikipedia.org/wiki/Posynomial
or
https://inst.eecs.berkeley.edu/~ee127a/book/login/l_gp_posy.html
 