# Homework Help: Trying to prove inequality with Lagrange multipliers

1. Mar 14, 2006

### xman

Show that if we have N positive numbers
$$\left\{ p_{i}\right\}_{i=1}^{N}$$
such that
$$\sum_{i} p_{i} =1$$
then for any N numbers
$$\left\{x_{i}\right\}_{i=1}^{N}$$
we have the inequality
$$\prod_{i=1}^{N} x_{i}^{2 p_{i}} \leq \sum_{i=1}^{N} p_{i}x_{i}^{2}$$
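Before attacking a proof, the claimed inequality can be spot-checked numerically (a minimal sketch of my own in Python, standard library only; the weights and points are randomly generated, not from the problem):

```python
import random

# Spot-check the claim: prod_i x_i^(2 p_i) <= sum_i p_i x_i^2
# whenever the p_i are positive and sum to 1.
random.seed(0)
for _ in range(1000):
    N = random.randint(1, 6)
    raw = [random.random() + 1e-9 for _ in range(N)]
    total = sum(raw)
    p = [r / total for r in raw]                 # positive weights, sum to 1
    x = [random.uniform(-5.0, 5.0) for _ in range(N)]
    lhs = 1.0
    for pi, xi in zip(p, x):
        lhs *= (xi * xi) ** pi                   # prod_i x_i^{2 p_i}
    rhs = sum(pi * xi * xi for pi, xi in zip(p, x))
    assert lhs <= rhs + 1e-9                     # inequality holds each trial
```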

So I am thinking to show the inequality is true using Lagrange multipliers. First take the function
$$W = \sum_{i} p_{i}x_{i}^{2}$$
which we want to minimize subject to the constraint that
$$S = \prod_{i} x_{i}^{2p_{i}}$$
is held fixed, so we form the function
$$f^{\star} = f + \lambda g \Rightarrow f^{\star} =\sum_{i} p_{i}x_{i}^{2}+\lambda \left(S-\prod_{i} x_{i}^{2p_{i}}\right)$$
So I think everything so far is ok... my question is: how do you differentiate an infinite series and an infinite product? Also, in this case is the Lagrange multiplier a single value $$\lambda$$, or is there one multiplier for each value of i; that is, do I need a $$\lambda_{i}$$? Any direction or input is greatly appreciated.

2. Mar 14, 2006

### StatusX

I'm a little confused by some of your wording, but I think what you're trying to do is show that, for a fixed set of pi, and a fixed value of S, W is always greater than or equal to S, whatever the xi that give this S may be. If you show this is true for any S, then for a given set of xi, these give rise to some S, and then you know the corresponding W for these xi must be greater than or equal to S. That's an interesting approach. I'm not sure if it'll work, but it's worth a try.

You'll only need one $\lambda$, because there's only one constraint equation. But you need to differentiate with respect to each xi. Namely, you have:

$$\frac{\partial}{\partial x_i} \left( f + \lambda g \right) = 0$$

for all i from 1 to N.

3. Mar 14, 2006

### xman

Sorry if I wasn't a little more clear, but that's exactly what I'm trying to show. I thought this would be a fun problem, and since I am not really familiar with Lagrange multipliers, I thought I would try proving it this way. So, only one common multiplier, great. Now, is this correct:
$$\frac{\partial}{\partial x_{j}} \left(p_{i} x_{i}^{2}\right)= 2p_{i}x_{i} \frac{\partial x_{i}}{\partial x_{j}} \delta_{ij}$$
where $$\delta_{ij}$$ is the Kronecker delta of course. Now for the infinite product I'm not sure.

4. Mar 14, 2006

### StatusX

First off, you want:

$$\frac{\partial x_i}{\partial x_j}= \delta_{ij}$$

since the xi are independent. And, just for the record, neither the product nor the sum is infinite; they both range from 1 to N.

To differentiate the product, just write it out. All the terms that don't involve xi will be constants when you differentiate with respect to xi. You will end up with the original product times some prefactor.
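The prefactor claim can be verified by a finite-difference check (a sketch of my own; the particular weights and point are made up, and the claim being tested is that $$\partial S/\partial x_{j} = (2p_{j}/x_{j})\,S$$):

```python
# Finite-difference check that d/dx_j of S = prod_i x_i^(2 p_i)
# equals (2 p_j / x_j) * S, i.e. the original product times a prefactor.
p = [0.2, 0.3, 0.5]        # example weights summing to 1 (my choice)
x = [1.5, 2.0, 0.7]        # example positive point (my choice)

def S(xs):
    out = 1.0
    for pi, xi in zip(p, xs):
        out *= xi ** (2.0 * pi)
    return out

j, h = 1, 1e-6
xp, xm = x[:], x[:]
xp[j] += h
xm[j] -= h
numeric = (S(xp) - S(xm)) / (2.0 * h)    # central difference
analytic = 2.0 * p[j] / x[j] * S(x)      # prefactor times the product
assert abs(numeric - analytic) < 1e-6
```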

5. Mar 14, 2006

### xman

Right, I meant to write
$$\frac{\partial x_{i}}{\partial x_{j}}=\delta_{ij}$$
Sorry I keep saying "infinite" for the product and sum. Ok, so is this correct:
$$2 \sum_{i=1}^{N}p_{i} x_{i}+ \lambda \left(S-2 \prod_{i=1}^{N} p_{i}x_{i}^{2p_{i}-1} \right) =0$$
Does this seem reasonable?

6. Mar 14, 2006

### StatusX

No, you only want to differentiate with respect to one xi at a time. You'll get N different equations.

7. Mar 14, 2006

### xman

Sorry, I'm a little uncomfortable with these differentiation rules. Here it goes, so I should get
$$2 p_{j} x_{j}+ \lambda \left( \frac{\partial S}{\partial x_{j}}-2 p_{j} x_{j}^{2p_{j}-1} \left( \prod_{i <j} x_{i}^{2p_{i}}\right)\right)=0$$
where $$1<j<i<N$$

8. Mar 14, 2006

### StatusX

Closer. If you replaced i<j by i≠j, you'd just about have it, although there's a more convenient way to write it (in terms of S). And remember, S is a constant.

9. Mar 14, 2006

### xman

Great, so it would be something like $$\ldots$$

$$2 p_{j} x_{j}+ \lambda \left( -2 p_{j} x_{j}^{2p_{j}-1} \left( \prod_{i \neq j} x_{i}^{2p_{i}}\right)\right)=0$$

From here we solve for $$\lambda$$ with the requirement that we want to minimize, right?

10. Mar 14, 2006

### StatusX

Well, if you rewrite the last term in terms of S, like I suggested, you'll see that the equation is the same for every xi. What does this tell you? (You don't need to know lambda)

11. Mar 14, 2006

### xman

Oh snap, are you saying something along the lines of
$$\frac{\partial f^{\star}}{\partial x_{1}} = 2 p_{1} x_{1} - 2\lambda p_{1}x_{1}^{2p_{1}-1}\prod_{i=2}^{N} x_{i}^{2p_{i}}=0$$
$$\vdots$$
$$\frac{\partial f^{\star}}{\partial x_{N}} = 2p_{N} x_{N} -2\lambda p_{N}x_{N}^{2p_{N}-1} \prod_{i=1}^{N-1} x_{i}^{2p_{i}}=0$$
So
$$\frac{\partial f^{\star}}{\partial x_{1}} = p_{1} x_{1} -\lambda p_{1}x_{1}^{-1}S=0, \quad \ldots, \quad \frac{\partial f^{\star}}{\partial x_{N}} = p_{N} x_{N} -\lambda p_{N}x_{N}^{-1}S=0$$
$$\Rightarrow \sum_{i} p_{i} x_{i}^{2} = \lambda \left(\sum_{i}p_{i}\right)S$$
Thus
$$\Rightarrow S^{-1} \sum_{i} p_{i} x_{i}^{2} = \lambda$$
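This can be checked numerically (a sketch of my own with made-up numbers): with all the $$x_{i}$$ equal, the single value $$\lambda = W/S$$ satisfies every stationarity equation at once.

```python
# With all x_i equal, lambda = W / S satisfies every stationarity
# equation 2 p_j x_j - 2 lambda p_j S / x_j = 0 simultaneously.
p = [0.2, 0.3, 0.5]            # example weights summing to 1 (my choice)
c = 1.3                        # common value of the x_i (my choice)
x = [c] * len(p)

S = 1.0
for pi, xi in zip(p, x):
    S *= xi ** (2.0 * pi)      # S = prod_i x_i^{2 p_i}
W = sum(pi * xi * xi for pi, xi in zip(p, x))

lam = W / S                    # the multiplier value derived above
for pj, xj in zip(p, x):
    assert abs(2.0 * pj * xj - 2.0 * lam * pj * S / xj) < 1e-12
```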

Last edited: Mar 14, 2006
12. Mar 14, 2006

### StatusX

I was with you up till the last line. Try writing an expression for each xi in terms of only lambda and S. The exact equation isn't important, what is important is that it is the same for all xi, which means... (the key point).

13. Mar 14, 2006

### xman

So we have
$$p_{1} x_{1}=\lambda \frac{p_{1}S}{x_{1}} \ldots p_{N}x_{N}= \lambda \frac{p_{N} S}{x_{N}}$$
which simplifies to
$$x_{1} = \lambda \frac{S}{x_{1}} \ldots x_{N} = \lambda \frac{S}{x_{N}} \Rightarrow x_{1}^{2}=\lambda S\ldots x_{N}^{2}=\lambda S$$
Right? I guess I'm obviously missing the key point here, the only thing that comes to mind now is
$$\lambda = \frac{x_{1}^{2}}{S} = \ldots =\frac{x_{N}^{2}}{S}$$
So
$$x_{1}^{2}=\ldots = x_{N}^{2}$$
So
$$\prod_{i=1}^{N} \left(x_{i}^{2}\right)^{p_{i}} = \left(x_{N}^{2}\right)^{\sum_{i}p_{i}} = x_{N}^{2}$$
since we are given that
$$\sum_{i} p_{i} =1$$
Am I heading down the wrong path again?
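That collapse can be confirmed numerically (a minimal sketch with made-up weights, my own illustration): with all the $$x_{i}$$ equal, both $$S$$ and $$W$$ reduce to $$x^{2}$$.

```python
# With all x_i equal to some x, both S = prod_i x_i^(2 p_i) and
# W = sum_i p_i x_i^2 reduce to x^2, using sum_i p_i = 1.
p = [0.1, 0.25, 0.65]          # example weights summing to 1 (my choice)
x_val = 1.7                    # common value of the x_i (my choice)
x = [x_val] * len(p)

S = 1.0
for pi, xi in zip(p, x):
    S *= xi ** (2.0 * pi)
W = sum(pi * xi * xi for pi, xi in zip(p, x))

assert abs(S - x_val ** 2) < 1e-12   # S collapses to x^2
assert abs(W - x_val ** 2) < 1e-12   # W collapses to x^2
```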

14. Mar 14, 2006

### StatusX

Right, you needed to show they were all equal. What this means is that W is at an extreme value (max or min) when all of the xi are equal. Now just calculate what S and W are in this case, and verify the inequality. You then need to verify this is actually the minimum of W.

15. Mar 15, 2006

### xman

Awesome, so I want to show that W is a minimum here. Hence,
$$W(x^{c}) \leq W(x^{c}+\epsilon) \Rightarrow x^{2} \left(\sum_{i=1}^{N}p_{i}\right) \leq \left(x+\epsilon\right)^{2} \left(\sum_{i=1}^{N} p_{i}\right)$$
Yielding
$$0\leq \epsilon \left(2x+\epsilon\right)$$
which shows us that points to the left of the critical point have negative slope and points to the right have positive slope; since $$\epsilon >0$$, the sign of the expression is dominated by the x term, which indeed gives a minimum. Now for the inequality it suffices to show
$$\prod_{i=1}^{N} x_{i}^{2p_{i}} \leq \sum_{i=1}^{N} p_{i}x_{i}^{2} \mid_{x=x^{c}}$$
which immediately reduces to equality as
$$x^{2}=x^{2}$$
per my previous post. Finally, we conclude: since W attains its minimum at the critical point $$x^{c} = x_{1}=\cdots=x_{N}$$, where we have equality, W is bounded below by S everywhere, and therefore the inequality is true.

Are there any points I missed in the wrap-up here?
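The wrap-up can be stress-tested end to end (a sketch of my own with random positive inputs and made-up weights): W - S stays nonnegative throughout, with equality when the x_i coincide.

```python
import random

# End-to-end check: W - S >= 0 on random positive inputs, with
# equality (up to rounding) when all the x_i coincide.
def S_and_W(p, x):
    S = 1.0
    for pi, xi in zip(p, x):
        S *= (xi * xi) ** pi                 # prod_i x_i^{2 p_i}
    W = sum(pi * xi * xi for pi, xi in zip(p, x))
    return S, W

p = [0.3, 0.3, 0.4]            # example weights summing to 1 (my choice)
random.seed(1)
for _ in range(500):
    x = [random.uniform(0.1, 3.0) for _ in p]
    S, W = S_and_W(p, x)
    assert W - S >= -1e-12     # W is always at least S

S_eq, W_eq = S_and_W(p, [2.0, 2.0, 2.0])
assert abs(W_eq - S_eq) < 1e-12   # equality at the critical point
```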