
Trying to prove an inequality with Lagrange multipliers

  Mar 14, 2006 #1
    Show that if we have N positive numbers
    [tex] \left\{ p_{i}\right\}_{i=1}^{N} [/tex]
    such that
    [tex] \sum_{i} p_{i} =1 [/tex]
    then for any N numbers
    [tex] \left\{x_{i}\right\}_{i=1}^{N} [/tex]
    we have the inequality
    [tex] \prod_{i=1}^{N} x_{i}^{2 p_{i}} \leq \sum_{i=1}^{N} p_{i}x_{i}^{2} [/tex]
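    For instance, with N = 2 the claim reads
    [tex] x_{1}^{2p_{1}} x_{2}^{2p_{2}} \leq p_{1}x_{1}^{2}+p_{2}x_{2}^{2} [/tex]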

    So I am thinking to show the inequality is true using Lagrange multipliers. First take the function
    [tex] W = \sum_{i} p_{i}x_{i}^{2} [/tex]
    which we want to minimize subject to the constraint that
    [tex] S = \prod_{i} x_{i}^{2p_{i}} [/tex]
    is held fixed, so we form the function
    [tex] f^{\star} = f + \lambda g \Rightarrow f^{\star} =\sum_{i} p_{i}x_{i}^{2}+\lambda \left(S-\prod_{i} x_{i}^{2p_{i}}\right) [/tex]
    So I think everything so far is ok... my question is: how do you differentiate an infinite series and an infinite product? Also, in this case is the Lagrange multiplier a single value [tex]\lambda[/tex], or is there one multiplier for each value of i; that is, do I need a [tex] \lambda_{i} [/tex]? Any direction or input is greatly appreciated.
     
  Mar 14, 2006 #2

    StatusX (Homework Helper)

    I'm a little confused by some of your wording, but I think what you're trying to do is show that, for a fixed set of [itex]p_i[/itex] and a fixed value of S, W is always greater than or equal to S, whatever the [itex]x_i[/itex] that give this S may be. If you show this is true for any S, then for a given set of [itex]x_i[/itex], these give rise to some S, and then you know the corresponding W for these [itex]x_i[/itex] must be greater than or equal to S. That's an interesting approach. I'm not sure if it'll work, but it's worth a try.
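    That is, the subproblem is to minimize
    [tex] W = \sum_{i} p_{i}x_{i}^{2} [/tex]
    subject to
    [tex] \prod_{i} x_{i}^{2p_{i}} = S [/tex]
    with S held fixed.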

    You'll only need one [itex]\lambda[/itex], because there's only one constraint equation. But you need to differentiate with respect to each [itex]x_i[/itex]. Namely, you have:

    [tex]\frac{\partial}{\partial x_i} \left( f + \lambda g \right) = 0[/tex]

    for all i from 1 to N.
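
    (Here, in your notation, [itex]f = W[/itex] and [itex]g = S - \prod_{i} x_{i}^{2p_{i}}[/itex].)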
     
  Mar 14, 2006 #3
    Thanks for replying, StatusX.
    Sorry I wasn't a little clearer, but that's exactly what I'm trying to show. I thought this would be a fun problem, and since I am not really familiar with Lagrange multipliers, I thought I would try proving it this way. So, only one common multiplier, great. Now, is this correct:
    [tex] \frac{\partial}{\partial x_{j}} \left(p_{i} x_{i}^{2}\right)= 2p_{i}x_{i} \frac{\partial x_{i}}{\partial x_{j}} \delta_{ij} [/tex]
    where [tex] \delta_{ij} [/tex] is the Kronecker delta, of course. Now for the infinite product I'm not sure.
     
  Mar 14, 2006 #4

    StatusX (Homework Helper)

    First off, you want:

    [tex]\frac{\partial x_i}{\partial x_j}= \delta_{ij}[/tex]

    since the [itex]x_i[/itex] are independent. And, just for the record, neither the product nor the sum is infinite; they both range from 1 to N.

    To differentiate the product, just write it out. All the factors that don't involve [itex]x_i[/itex] are constants when you differentiate with respect to [itex]x_i[/itex]. You will end up with the original product times some prefactor.
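
    For example, with just two factors,
    [tex]\frac{\partial}{\partial x_{1}}\left(x_{1}^{a} x_{2}^{b}\right) = a x_{1}^{a-1} x_{2}^{b} = \frac{a}{x_{1}}\left(x_{1}^{a} x_{2}^{b}\right)[/tex]
    and that last form, the original product times a prefactor, is the one you want.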
     
  Mar 14, 2006 #5
    Right, I meant to write
    [tex] \frac{\partial x_{i}}{\partial x_{j}}=\delta_{ij} [/tex]
    Sorry I keep saying "infinite" for the product and sum. Ok, so is this correct:
    [tex] 2 \sum_{i=1}^{N}p_{i} x_{i}+ \lambda \left(S-2 \prod_{i=1}^{N} p_{i}x_{i}^{2p_{i}-1} \right) =0 [/tex]
    Does this seem reasonable?
     
  Mar 14, 2006 #6

    StatusX (Homework Helper)

    No, you only want to differentiate with respect to one [itex]x_i[/itex] at a time. You'll get N different equations.
     
  Mar 14, 2006 #7
    Sorry, I'm a little uncomfortable with these differentiation rules. Here it goes; I should get
    [tex] 2 p_{j} x_{j}+ \lambda \left( \frac{\partial S}{\partial x_{j}}-2 p_{j} x_{j}^{2p_{j}-1} \left( \prod_{i <j} x_{i}^{2p_{i}}\right)\right)=0 [/tex]
    for each [tex] 1 \leq j \leq N [/tex]
     
  Mar 14, 2006 #8

    StatusX (Homework Helper)

    Closer. If you replaced i<j by i≠j, you'd just about have it, although there's a more convenient way to write it (in terms of S). And remember, S is a constant.
     
  Mar 14, 2006 #9
    Great, so it would be something like...

    [tex] 2 p_{j} x_{j}+ \lambda \left( -2 p_{j} x_{j}^{2p_{j}-1} \left( \prod_{i \neq j} x_{i}^{2p_{i}}\right)\right)=0 [/tex]

    From here we solve for [tex] \lambda [/tex] with the requirement that we want to minimize, right?
     
  Mar 14, 2006 #10

    StatusX (Homework Helper)

    Well, if you rewrite the last term in terms of S, like I suggested, you'll see that the equation is the same for every [itex]x_i[/itex]. What does this tell you? (You don't need to know [itex]\lambda[/itex].)
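
    (That is, [tex] x_i^{2p_i-1} \prod_{j \neq i} x_j^{2p_j} = \frac{S}{x_i} [/tex], so every one of the N equations takes the same form in S.)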
     
  Mar 14, 2006 #11
    Oh snap, are you saying something along the lines of
    [tex] \frac{\partial f^{\star}}{\partial x_{1}} = 2 p_{1} x_{1} - 2\lambda p_{1}x_{1}^{2p_{1}-1}\prod_{i=2}^{N} x_{i}^{2p_{i}}=0 [/tex]
    [tex] \vdots [/tex]
    [tex] \frac{\partial f^{\star}}{\partial x_{N}} = 2p_{N} x_{N} - 2\lambda p_{N}x_{N}^{2p_{N}-1} \prod_{i=1}^{N-1} x_{i}^{2p_{i}}=0 [/tex]
    So
    [tex] \Rightarrow \frac{\partial f^{\star}}{\partial x_{1}} = p_{1} x_{1} - \lambda p_{1}x_{1}^{-1}S=0 ,\quad \ldots ,\quad \frac{\partial f^{\star}}{\partial x_{N}} = p_{N} x_{N} - \lambda p_{N}x_{N}^{-1}S=0 [/tex]
    [tex] \Rightarrow \sum_{i} p_{i} x_{i}^{2} = \lambda \left(\sum_{i}p_{i}\right)S [/tex]
    Thus
    [tex] \Rightarrow S^{-1} \sum_{i} p_{i} x_{i}^{2} = \lambda [/tex]
     
    Last edited: Mar 14, 2006
  Mar 14, 2006 #12

    StatusX (Homework Helper)

    I was with you up till the last line. Try writing an expression for each [itex]x_i[/itex] in terms of only [itex]\lambda[/itex] and S. The exact equation isn't important; what is important is that it is the same for all [itex]x_i[/itex], which means... (the key point).
     
  Mar 14, 2006 #13
    So we have
    [tex] p_{1} x_{1}=\lambda \frac{p_{1}S}{x_{1}} ,\quad \ldots ,\quad p_{N}x_{N}= \lambda \frac{p_{N} S}{x_{N}} [/tex]
    which simplifies to
    [tex] x_{1} = \lambda \frac{S}{x_{1}} ,\quad \ldots ,\quad x_{N} = \lambda \frac{S}{x_{N}} \quad\Rightarrow\quad x_{1}^{2}=\lambda S ,\ \ldots ,\ x_{N}^{2}=\lambda S [/tex]
    Right? I guess I'm obviously missing the key point here; the only thing that comes to mind now is
    [tex] \lambda = \frac{x_{1}^{2}}{S} = \ldots =\frac{x_{N}^{2}}{S} [/tex]
    so
    [tex] x_{1}^{2}=\ldots = x_{N}^{2} [/tex]
    and then
    [tex] \prod_{i=1}^{N} \left(x_{i}^{2}\right)^{p_{i}} = \left(x_{N}^{2}\right)^{\sum_{i}p_{i}} = x_{N}^{2} [/tex]
    since we are given that
    [tex] \sum_{i} p_{i} =1 [/tex]
    Am I heading down the wrong path again?
     
  Mar 14, 2006 #14

    StatusX (Homework Helper)

    Right, you needed to show they were all equal. What this means is that W is at an extreme value (max or min) when all of the [itex]x_i[/itex] are equal. Now just calculate what S and W are in this case, and verify the inequality. You then need to verify this is actually the minimum of W.
     
  Mar 15, 2006 #15
    Awesome, so I want to show that W is a minimum here. Hence,
    [tex] W(x^{c}) \leq W(x^{c}+\epsilon) \Rightarrow
    x^{2} \left(\sum_{i=1}^{N}p_{i}\right) \leq \left(x+\epsilon\right)^{2} \left(\sum_{i=1}^{N} p_{i}\right) [/tex]
    yielding
    [tex] 0\leq \epsilon \left(2x+\epsilon\right) [/tex]
    which shows that for points to the left we have a negative slope and for points to the right we have a positive slope; since [tex] \epsilon > 0 [/tex], the sign of the expression is dominated by the [tex] x [/tex] term, which indeed gives a minimum. Now for the inequality it suffices to show
    [tex] \prod_{i=1}^{N} x_{i}^{2p_{i}} \leq \sum_{i=1}^{N} p_{i}x_{i}^{2} \mid_{x=x^{c}} [/tex]
    which immediately reduces to equality, as
    [tex] x^{2}=x^{2} [/tex]
    per my previous post. Finally, we conclude: since W is at a minimum at the critical point [tex] x^{c} = x_{1}=\cdots=x_{N} [/tex], where it equals S, W is bounded below by S, and therefore the inequality is true.

    Are there any points I missed in the wrap-up here?
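
    As a quick numerical sanity check (just a sketch in Python, assuming numpy is available, and reading [tex] x_{i}^{2p_{i}} [/tex] as [tex] \left(x_{i}^{2}\right)^{p_{i}} [/tex] so that negative [tex] x_{i} [/tex] are fine):

    [code]
    import numpy as np

    # Spot-check:  prod_i x_i^(2 p_i)  <=  sum_i p_i x_i^2
    rng = np.random.default_rng(0)
    for trial in range(10000):
        N = int(rng.integers(2, 10))
        p = rng.random(N)
        p /= p.sum()                  # positive weights with sum(p) = 1
        x = rng.normal(size=N)        # arbitrary real x_i
        lhs = np.prod((x ** 2) ** p)  # x_i^(2 p_i) read as (x_i^2)^(p_i)
        rhs = np.sum(p * x ** 2)
        assert lhs <= rhs + 1e-12, (lhs, rhs)
    print("inequality held in all 10000 trials")
    [/code]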
     