Trying to prove an inequality with Lagrange multipliers

SUMMARY

The forum discussion focuses on proving the inequality \(\prod_{i=1}^{N} x_{i}^{2 p_{i}} \leq \sum_{i=1}^{N} p_{i}x_{i}^{2}\) using Lagrange multipliers. The participants clarify that only one Lagrange multiplier, \(\lambda\), is needed due to the single constraint equation. They discuss differentiating the function \(f^{\star} = \sum_{i} p_{i}x_{i}^{2} + \lambda \left(S - \prod_{i} x_{i}^{2p_{i}}\right)\) with respect to each \(x_{i}\) and conclude that the inequality holds when all \(x_{i}\) are equal, confirming that \(W\) reaches a minimum at this point.

PREREQUISITES
  • Understanding of Lagrange multipliers
  • Familiarity with differentiation of products and sums
  • Knowledge of Kronecker delta notation
  • Basic concepts of inequalities in mathematical analysis
NEXT STEPS
  • Study the application of Lagrange multipliers in optimization problems
  • Learn about differentiation techniques for products and sums
  • Explore the properties of inequalities in mathematical analysis
  • Investigate the implications of equality conditions in optimization
USEFUL FOR

Mathematicians, students studying optimization techniques, and anyone interested in advanced calculus and inequality proofs.

xman
Show that if we have N positive numbers
\left\{ p_{i}\right\}_{i=1}^{N}
such that
\sum_{i} p_{i} =1
then for any N numbers
\left\{x_{i}\right\}_{i=1}^{N}
we have the inequality
\prod_{i=1}^{N} x_{i}^{2 p_{i}} \leq \sum_{i=1}^{N} p_{i}x_{i}^{2}

So I am thinking to show the inequality is true using Lagrange multipliers. First take
W = \sum_{i} p_{i}x_{i}^{2}
which we want to minimize subject to the constraint
S = \prod_{i} x_{i}^{2p_{i}}
so we form the function
f^{\star} = f + \lambda g \Rightarrow f^{\star} =\sum_{i} p_{i}x_{i}^{2}+\lambda \left(S-\prod_{i} x_{i}^{2p_{i}}\right)
So I think everything so far is OK. My question is: how do you differentiate an infinite series and an infinite product? Also, in this case is the Lagrange multiplier a single value \lambda, or is there one multiplier for each value of i; that is, do I need a \lambda_{i}? Any direction or input is greatly appreciated.
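As a quick numerical sanity check of the inequality itself (a sketch, not part of the proof; it assumes NumPy and reads x_{i}^{2p_{i}} as \left(x_{i}^{2}\right)^{p_{i}}, so negative x_{i} pose no problem):

```python
import numpy as np

rng = np.random.default_rng(0)

for _ in range(1000):
    N = rng.integers(2, 8)
    p = rng.random(N)
    p /= p.sum()                  # weights p_i > 0 summing to 1
    x = rng.normal(size=N)        # arbitrary real x_i
    lhs = np.prod((x**2) ** p)    # prod (x_i^2)^{p_i}
    rhs = np.sum(p * x**2)        # sum p_i x_i^2
    assert lhs <= rhs + 1e-12     # small tolerance for rounding
print("inequality held in 1000 random trials")
```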
 
I'm a little confused by some of your wording, but I think what you're trying to do is show that, for a fixed set of p_i and a fixed value of S, W is always greater than or equal to S, whatever the x_i that give this S may be. If you show this is true for any S, then a given set of x_i gives rise to some S, and you know the corresponding W for these x_i must be greater than or equal to that S. That's an interesting approach. I'm not sure if it'll work, but it's worth a try.

You'll only need one \lambda, because there's only one constraint equation. But you need to differentiate with respect to each x_i. Namely, you have:

\frac{\partial}{\partial x_i} \left( f + \lambda g \right) = 0

for all i from 1 to N.
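For concreteness, here is a small symbolic sketch of that system for a hypothetical N = 3 (assuming SymPy; f, g, and fstar mirror the thread's notation):

```python
import sympy as sp

N = 3
x = sp.symbols(f"x1:{N + 1}", positive=True)
p = sp.symbols(f"p1:{N + 1}", positive=True)
lam, S = sp.symbols("lambda S", positive=True)

f = sum(pi * xi**2 for pi, xi in zip(p, x))                   # W = sum p_i x_i^2
g = S - sp.Mul(*[xi**(2 * pi) for pi, xi in zip(p, x)])       # the one constraint
fstar = f + lam * g

# One stationarity equation per x_i, all sharing the single lambda.
for xi in x:
    sp.pprint(sp.Eq(sp.diff(fstar, xi), 0))
```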
 
Thanks for replying StatusX,
Sorry I wasn't clearer, but that's exactly what I'm trying to show. I thought this would be a fun problem, and since I am not really familiar with Lagrange multipliers, I thought I would try proving it this way. So, only one common multiplier; great. Now, is this correct:
\frac{\partial}{\partial x_{j}} \left(p_{i} x_{i}^{2}\right)= 2p_{i}x_{i} \frac{\partial x_{i}}{\partial x_{j}} \delta_{ij}
where \delta_{ij} is the Kronecker delta, of course. Now, for the infinite product, I'm not sure.
 
First off, you want:

\frac{\partial x_i}{\partial x_j}= \delta_{ij}

since the x_i are independent. And, just for the record, neither the product nor the sum is infinite; they both range from 1 to N.

To differentiate the product, just write it out. All the factors that don't involve x_i will be constants when you differentiate with respect to x_i. You will end up with the original product times some prefactor.
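That prefactor claim is easy to check symbolically for a hypothetical three-variable instance (again assuming SymPy): the derivative of the product with respect to x_1 is the product itself times 2p_1/x_1.

```python
import sympy as sp

x1, x2, x3 = sp.symbols("x1 x2 x3", positive=True)
p1, p2, p3 = sp.symbols("p1 p2 p3", positive=True)

S = x1**(2 * p1) * x2**(2 * p2) * x3**(2 * p3)  # the constraint product

# d/dx1 of the product equals (2 p1 / x1) times the original product.
print(sp.simplify(sp.diff(S, x1) - (2 * p1 / x1) * S))  # prints 0
```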
 
StatusX said:
First off, you want:

\frac{\partial x_i}{\partial x_j}= \delta_{ij}

since the x_i are independent. And, just for the record, neither the product nor the sum is infinite; they both range from 1 to N.

To differentiate the product, just write it out. All the factors that don't involve x_i will be constants when you differentiate with respect to x_i. You will end up with the original product times some prefactor.

Right, I meant to write
\frac{\partial x_{i}}{\partial x_{j}}=\delta_{ij}
Sorry I keep saying "infinite" for the product and sum. OK, so is this correct:
2 \sum_{i=1}^{N}p_{i} x_{i}+ \lambda \left(S-2 \prod_{i=1}^{N} p_{i}x_{i}^{2p_{i}-1} \right) =0
Does this seems reasonable?
 
No, you only want to differentiate with respect to one x_i at a time. You'll get N different equations.
 
StatusX said:
No, you only want to differentiate with respect to one x_i at a time. You'll get N different equations.

Sorry, I'm a little uncomfortable with these differentiation rules. Here it goes: I should get
2 p_{j} x_{j}+ \lambda \left( \frac{\partial S}{\partial x_{j}}-2 p_{j} x_{j}^{2p_{j}-1} \left( \prod_{i <j} x_{i}^{2p_{i}}\right)\right)=0
where 1 \leq j \leq N
 
Closer. If you replaced i<j by i≠j, you'd just about have it, although there's a more convenient way to write it (in terms of S). And remember, S is a constant.
 
Great, so it would be something like...

2 p_{j} x_{j}+ \lambda \left( -2 p_{j} x_{j}^{2p_{j}-1} \left( \prod_{i \neq j} x_{i}^{2p_{i}}\right)\right)=0

From here we solve for \lambda with the requirement that we want to minimize, right?
 
Well, if you rewrite the last term in terms of S, like I suggested, you'll see that the equation is the same for every x_i. What does this tell you? (You don't need to know \lambda.)
 
Oh snap, are you saying something along the lines of
\frac{\partial f^{\star}}{\partial x_{1}} = 2 p_{1} x_{1} - 2\lambda p_{1}x_{1}^{2p_{1}-1}\prod_{i=2}^{N} x_{i}^{2p_{i}}=0, \quad \ldots, \quad \frac{\partial f^{\star}}{\partial x_{N}} = 2p_{N} x_{N} - 2\lambda p_{N}x_{N}^{2p_{N}-1} \prod_{i=1}^{N-1} x_{i}^{2p_{i}}=0
So, dividing by 2 and writing each product as S/x_{j},
p_{1} x_{1} - \lambda p_{1}x_{1}^{-1}S=0, \quad \ldots, \quad p_{N} x_{N} - \lambda p_{N}x_{N}^{-1}S=0
and, multiplying the j-th equation by x_{j} and summing over j,
\sum_{i} p_{i} x_{i}^{2} = \lambda \left(\sum_{i}p_{i}\right)S
Thus
S^{-1} \sum_{i} p_{i} x_{i}^{2} = \lambda
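A small numeric illustration of that last line (hypothetical weights, assuming NumPy): with all x_i equal to a common value c, both S and W reduce to c^2, so the multiplier comes out as \lambda = W/S = 1.

```python
import numpy as np

p = np.array([0.2, 0.5, 0.3])   # hypothetical weights with sum 1
c = 1.7
x = np.full(3, c)               # candidate critical point: all x_i equal

S = np.prod(x ** (2 * p))       # = c**2, since the exponents sum to 2
W = np.sum(p * x**2)            # = c**2 as well
print(S, W, W / S)              # c**2, c**2, lambda = 1.0
```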
 
I was with you up till the last line. Try writing an expression for each x_i in terms of only \lambda and S. The exact equation isn't important; what is important is that it is the same for all x_i, which means... (the key point).
 
So we have
p_{1} x_{1}=\lambda \frac{p_{1}S}{x_{1}}, \quad \ldots, \quad p_{N}x_{N}= \lambda \frac{p_{N} S}{x_{N}}
which simplifies to
x_{1} = \lambda \frac{S}{x_{1}}, \quad \ldots, \quad x_{N} = \lambda \frac{S}{x_{N}} \Rightarrow x_{1}^{2}=\lambda S, \ldots, x_{N}^{2}=\lambda S
Right? I guess I'm obviously missing the key point here; the only thing that comes to mind now is
\lambda = \frac{x_{1}^{2}}{S} = \ldots =\frac{x_{N}^{2}}{S}
So
x_{1}^{2}=\ldots = x_{N}^{2}
So
\prod_{i=1}^{N} \left(x_{i}^{2}\right)^{p_{i}} = \left(x_{N}^{2}\right)^{\sum_{i}p_{i}} = x_{N}^{2}
since we are given that
\sum_{i} p_{i} =1
Am I heading down the wrong path again?
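That collapse of the product can also be checked symbolically for a hypothetical N = 3 (assuming SymPy): once all the x_i^2 share a common value t, the exponents add up to \sum_i p_i = 1.

```python
import sympy as sp

t, p1, p2, p3 = sp.symbols("t p1 p2 p3", positive=True)

prod = t**p1 * t**p2 * t**p3                   # all x_i^2 equal t
print(sp.powsimp(prod).subs(p1 + p2 + p3, 1))  # prints t, using sum p_i = 1
```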
 
Right, you needed to show they were all equal. What this means is that W is at an extreme value (max or min) when all of the x_i are equal. Now just calculate what S and W are in this case, and verify the inequality. You then need to verify this is actually the minimum of W.
 
StatusX said:
Right, you needed to show they were all equal. What this means is that W is at an extreme value (max or min) when all of the x_i are equal. Now just calculate what S and W are in this case, and verify the inequality. You then need to verify this is actually the minimum of W.
Awesome, so I want to show that W is a minimum here. Hence,
W(x^{c}) \leq W(x^{c}+\epsilon) \Rightarrow x^{2} \left(\sum_{i=1}^{N}p_{i}\right) \leq \left(x+\epsilon\right)^{2} \left(\sum_{i=1}^{N} p_{i}\right)
Yielding
0\leq \epsilon \left(2x+\epsilon\right)
which shows us a negative slope for points to the left and a positive slope for points to the right; since \epsilon > 0, the sign of the expression is dominated by x, which indeed gives a minimum. Now, for the inequality, it suffices to show
\left. \prod_{i=1}^{N} x_{i}^{2p_{i}} \leq \sum_{i=1}^{N} p_{i}x_{i}^{2} \,\right|_{x=x^{c}}
which immediately reduces to equality, as
x^{2}=x^{2}
per my previous post. Finally, we conclude: since W is at a minimum, and we have equality at the critical point x^{c} = x_{1}=\cdots=x_{N}, the sum is indeed an upper bound for our S, and therefore the inequality is true.

Are there any points I missed in the wrap-up here?
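To close the loop numerically, the whole argument can be sketched as a constrained minimization (assuming NumPy and SciPy; the weights and the constraint value S below are hypothetical): the minimizer should land on all-equal x_i, with the minimum value of W equal to S.

```python
import numpy as np
from scipy.optimize import minimize

p = np.array([0.2, 0.5, 0.3])   # hypothetical weights with sum 1
S = 4.0                         # hypothetical fixed value of the constraint

def W(x):
    return np.sum(p * x**2)     # the quantity to minimize

constraint = {"type": "eq", "fun": lambda x: np.prod(x ** (2 * p)) - S}

res = minimize(W, x0=np.array([1.0, 2.0, 3.0]), constraints=[constraint])
print(res.x)    # all components near sqrt(S) = 2
print(res.fun)  # minimum of W, near S = 4, so W >= S as claimed
```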
 
