
Optimizing fractions and Lagrange Multiplier

  1. Aug 14, 2017 #1

    joshmccraney

    Gold Member

    Hi PF!

When minimizing some fraction ##f(x)/g(x)##, can we use Lagrange multipliers and say we are trying to optimize ##f## subject to the constraint ##g=1##?

    Thanks
     
  3. Aug 14, 2017 #2

    Krylov

    Science Advisor
    Education Advisor

    No, I don't think so, if I understand your question correctly. Why would that be true? Consider ##f(x) = e^{-x}## and ##g(x) = 1 + x^2##.
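    (Concretely: the constraint ##g(x)=1+x^2=1## forces ##x=0##, so the constrained minimum of ##f## is ##f(0)=1##, whereas ##f(x)/g(x)=e^{-x}/(1+x^2)\to 0## as ##x\to\infty##, so the infimum of the quotient is ##0## and is not even attained.)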
     
  4. Aug 15, 2017 #3

    joshmccraney

    Gold Member

    Yeah, that's a good counterexample. I am curious because I am reading about the Ritz method for approximating eigenvalues. My book says: consider the eigenvalue problem ##Lu=\lambda Mu##, where ##L## and ##M## are linear operators, ##u## is a function, and ##\lambda## is the eigenvalue. Then the smallest eigenvalue ##\lambda_1## is given by $$\lambda_1=\min\frac{(Lu,u)}{(Mu,u)}$$ or equivalently $$\lambda_1=\min\limits_{(Mu,u)=1}(Lu,u)$$ where ##(f,g)=\int_a^bfg\,dx##.
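    (In the finite-dimensional case, assuming ##L## and ##M## are symmetric matrices with ##M## positive definite, this is the familiar statement that the minimum of the generalized Rayleigh quotient ##x^{T}Lx/x^{T}Mx## over ##x \neq 0## is the smallest eigenvalue of ##Lx=\lambda Mx##; the statement above is the same idea with the inner product ##(f,g)=\int_a^b fg\,dx##.)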
     
  5. Aug 15, 2017 #4

    Krylov

    Science Advisor
    Education Advisor

    Ah ok, yes, but this problem has a special structure. Let's write
    $$
    \lambda_1 := \min_{(Mu,u) \neq 0}\frac{(Lu,u)}{(Mu,u)}
    $$
    and
    $$
    \mu_1 := \min_{(Mu,u)=1}(Lu,u)
    $$
    Your book now claims that ##\lambda_1 = \mu_1##. How would you prove this directly (i.e., without Lagrange multipliers or anything like that)?
     
  6. Aug 16, 2017 #5

    joshmccraney

    Gold Member

    Take a derivative of what we're trying to minimize with respect to... hmmm, well, now I'm not so sure. Any ideas?
     
  7. Aug 16, 2017 #6

    Krylov

    Science Advisor
    Education Advisor

    Yes, but I would like you to puzzle a little bit, too.

    You have not given much of the precise context. Let's assume that
    • ##H## is a real Hilbert space with inner product ##(\cdot,\cdot)##,
    • ##L## and ##M## are operators with domains ##D(L) = D(M) = H##,
    • ##M## is a symmetric and positive operator, i.e. ##0 \le (Mu, u)## for all ##u \in H##.
    To be safe, I also replace your minima by infima. Let's make the minimization problems a bit more explicit using set notation. Write
    $$
    \Lambda_1 := \left\{\frac{(Lu,u)}{(Mu,u)}\,:\, u \in H,\, (Mu,u) > 0 \right\}, \qquad M_1 := \left\{(Lv,v)\,:\, v \in H,\, (Mv,v) = 1 \right\},
    $$
    so ##\lambda_1 = \inf{\Lambda_1}## and ##\mu_1 = \inf{M_1}##. (Note that the positivity of ##M## allowed me to replace the condition ##(Mu,u) \neq 0## by ##(Mu,u) > 0## in ##\Lambda_1##.)

    Now observe that ##M_1 \subseteq \Lambda_1##. (This already gives ##\lambda_1 \le \mu_1##.) With a little bit (but not much) more work, you can also show the reverse inclusion:
    $$
    M_1 \supseteq \Lambda_1 \qquad (\ast)
    $$
    Once this is done, you have ##\Lambda_1 = M_1##, so ##\lambda_1 = \mu_1## follows. If you like, try to deduce ##(\ast)## yourself.

    (The essential property you will use is that both the numerator and the denominator of the original function
    $$
    u \mapsto \frac{(Lu,u)}{(Mu,u)}
    $$
    are homogeneous of degree two, so rescaling ##u## by a nonzero constant does not change the function value.)
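    Explicitly, for linear ##L##, ##M## and any scalar ##c \neq 0##,
    $$
    \frac{(L(cu),cu)}{(M(cu),cu)} = \frac{c^2(Lu,u)}{c^2(Mu,u)} = \frac{(Lu,u)}{(Mu,u)},
    $$
    so the value of the quotient depends only on the ray through ##u##, not on its length.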
     
  8. Aug 16, 2017 #7

    joshmccraney

    Gold Member

    Ok, so I'm thinking that if the denominator were ##(u,u)## (a less general case), then all we would have to do is let ##v = u/\|u\|##. But with the ##M## operator present, I'm unsure how to proceed... I'll think more on this, but feel free to give the spoiler.

    On a separate note, I'm interested in this entire process of finding the lowest eigenvalues because I am trying to solve ##Lu=\lambda Mu##. If we instead look at a simpler problem ##Lu=\lambda u## I know the lowest eigenvalue is $$\lambda_1=\inf \frac{(Lu,u)}{(u,u)}.$$ Letting $$u=\sum_{i=1}^N a_i\psi_i$$ where ##\psi_i## are orthonormal basis vectors that satisfy boundary conditions, we deduce
    $$(Lu,u) = \sum_{i,k=1}^N F_{ik}a_ia_k, \qquad F_{ik}\equiv (L\psi_i,\psi_k).
    $$
    To find the infimum from above, we can reduce the minimization problem to minimizing ##(Lu,u)## subject to the constraint ##(u,u)=1##. This calls for Lagrange multipliers. Letting ##\Lambda## be the Lagrange multiplier and using the symmetry ##F_{ik}=F_{ki}##, take $$\frac{\partial}{\partial a_j}\left( \sum_{i,k=1}^N F_{ik}a_ia_k - \Lambda \sum_{i,k=1}^N a_ia_k(\psi_i,\psi_k)\right)=0\implies\\
    \frac{\partial}{\partial a_j}\left( \sum_{i,k=1}^N F_{ik}a_ia_k - \Lambda \sum_{i=1}^N a_i^2\right)=0\implies\\
    \sum_{k=1}^N F_{jk}a_k - \Lambda a_j=0\implies\\
    \det(F_{jk}-\Lambda\delta_{jk})=0,
    $$
    where the last step requires a nontrivial solution ##a##.
    Similarly, when solving the problem ##Lu=\lambda M u##, if we let ##(Mu,u) = \sum_{i,k=1}^N D_{ik}a_ia_k## with ##D_{ik}\equiv (M\psi_i,\psi_k)##, then the condition should be $$\det(F_{jk}-\Lambda D_{jk})=0,$$ where ##\lambda_1## is the smallest root ##\Lambda##. Does this look correct to you?

    I'm under the impression this technique is called the Rayleigh-Ritz variational approach.
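    To check that I follow the recipe, here's a minimal numerical sketch (my own toy example, not from the book): it assembles ##F_{ik}=(L\psi_i,\psi_k)## and ##D_{ik}=(M\psi_i,\psi_k)## for ##Lu=-u''## and ##(Mu)(x)=(1+x)\,u(x)## on ##(0,\pi)## with ##u(0)=u(\pi)=0##, using the sine basis ##\psi_i(x)=\sin(ix)##, and solves the generalized problem ##\det(F-\Lambda D)=0## numerically with SciPy.

    [code]
    # Rayleigh-Ritz sketch (toy example, not from the book):
    #   L u = -u''   and   (M u)(x) = (1 + x) u(x)   on (0, pi),  u(0) = u(pi) = 0.
    # Basis: psi_i(x) = sin(i x), which satisfies the boundary conditions.
    import numpy as np
    from scipy.integrate import quad
    from scipy.linalg import eigh

    a, b = 0.0, np.pi
    w = lambda x: 1.0 + x                 # weight defining M

    def psi(i, x):                        # basis function
        return np.sin(i * x)

    def dpsi(i, x):                       # its derivative
        return i * np.cos(i * x)

    def ritz_eigenvalues(N):
        F = np.zeros((N, N))              # F_ik = (L psi_i, psi_k) = int psi_i' psi_k' dx (integration by parts, boundary terms vanish)
        D = np.zeros((N, N))              # D_ik = (M psi_i, psi_k) = int w psi_i psi_k dx
        for i in range(1, N + 1):
            for k in range(1, N + 1):
                F[i - 1, k - 1] = quad(lambda x: dpsi(i, x) * dpsi(k, x), a, b)[0]
                D[i - 1, k - 1] = quad(lambda x: w(x) * psi(i, x) * psi(k, x), a, b)[0]
        # Generalized symmetric eigenproblem F a = Lambda D a, i.e. det(F - Lambda D) = 0
        return eigh(F, D, eigvals_only=True)

    for N in (4, 8, 12):
        print(N, ritz_eigenvalues(N)[:3]) # lowest Ritz values; they decrease toward lambda_1 as N grows
    [/code]

    The smallest root decreases monotonically as the basis grows, which is the variational upper-bound property I'd expect from the minimization.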
     
  9. Aug 17, 2017 #8

    Krylov

    Science Advisor
    Education Advisor

    The symmetry and, in particular, the positivity of the operator ##M## imply that ##(u_1,u_2) \mapsto (Mu_1,u_2)## defines a bilinear form on ##H \times H## that has all the properties of an inner product, with the exception that ##(Mu,u) = 0## may not imply ##u = 0##. In any case, for the problem at hand you can just let
    $$
    v =\frac{1}{\sqrt{(Mu,u)}}u
    $$
    (In other words, you scale by ##(Mu,u)^{-\frac{1}{2}}## instead of ##(u,u)^{-\frac{1}{2}}##.)
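    For instance, with that choice of ##v## (and ##(Mu,u) > 0##, as in ##\Lambda_1##), linearity gives
    $$
    (Mv,v)=\frac{(Mu,u)}{(Mu,u)}=1, \qquad (Lv,v)=\frac{(Lu,u)}{(Mu,u)},
    $$
    so every element of ##\Lambda_1## also lies in ##M_1##, which is exactly the inclusion ##(\ast)##.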
     
  10. Aug 17, 2017 #9

    joshmccraney

    Gold Member

  11. Aug 17, 2017 #10

    WWGD

    Science Advisor
    Gold Member

    I'd say, if ##f,g## are differentiable, you can use the condition ##\left(\frac{f}{g}\right)' = \frac{f'g-fg'}{g^2}=0##, i.e. ##f'g-fg'=0## wherever ##g(x) \neq 0##, since extrema are attained at critical points (or at points where the derivative is not defined).
     