
I Dot product constrained optimization

  1. Sep 21, 2016 #1
    Problem:

    Fix some vector ##\vec{a} \in \mathbb{R}^n \setminus \{\vec{0}\}## and define ##f( \vec{x} ) = \vec{a} \cdot \vec{x}##. Give an expression for the maximum of ##f(\vec{x})## subject to ##||\vec{x}||_2 = 1##.

    My work:

    This seems like a Lagrange multiplier problem.

    I have ##\mathcal{L}(\vec{x},\lambda) = \vec{a} \cdot \vec{x} - \lambda(||\vec{x}||_2 - 1)##

    Then ##D_{x_i} \mathcal{L}(\vec{x},\lambda) = a_i - \tfrac{1}{2}\lambda(\vec{x} \cdot \vec{x})^{-1/2} \cdot 2x_i = a_i - \lambda x_i/||\vec{x}|| = 0##. Solving for ##x_i## yields ##x_i = a_i||\vec{x}||/\lambda##.
    Also ##D_{\lambda} \mathcal{L}(\vec{x},\lambda) = -||\vec{x}|| + 1 = 0##, so ##||\vec{x}|| = 1##.
    Plugging that into the above expression I get ##x_i = a_i/\lambda##.

    But this answer doesn't make sense to me. For one, ##\lambda## should fall out, right? Also, just thinking about it -- wouldn't we want to set ##x_i = 1## for the maximal ##a_i## and set ##x_j = 0## for all ##j \neq i##, because any deviation from that would give a smaller value?
     
  3. Sep 21, 2016 #2

    Krylov


    I think that is like killing a fly with a cannonball, as we say. (The problem does not require such a heavy tool for its solution.)
     
  4. Sep 21, 2016 #3
    That's fair -- I gave an argument at the end not using Lagrange multipliers. I guess my question is -- why don't those two approaches match up?
     
  5. Sep 21, 2016 #4

    mathman


    The simple (second) answer is wrong. ##\vec{x} \cdot \vec{a}## is maximized, for fixed length ##||\vec{x}||##, when ##\vec{x}## is parallel to ##\vec{a}## and points in the same direction.
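    To tie this back to the Lagrange multiplier work above: ##\lambda## does not fall out, it is pinned down by the constraint. From ##x_i = a_i/\lambda## and ##||\vec{x}|| = 1## we get ##\lambda = ||\vec{a}||## (taking the positive root for the maximum), so ##\vec{x} = \vec{a}/||\vec{a}||## and the maximum value is ##||\vec{a}||##. A quick numerical sketch of this, assuming NumPy and using an arbitrary random vector for ##\vec{a}##:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    a = rng.normal(size=5)  # an arbitrary example vector a

    # Lagrange-multiplier answer: x parallel to a, unit length.
    x_parallel = a / np.linalg.norm(a)
    f_parallel = a @ x_parallel  # equals ||a||, since a.a/||a|| = ||a||

    # The "put all the weight on the largest coordinate" guess: x = e_i
    # where i = argmax_i a_i.
    x_axis = np.zeros_like(a)
    x_axis[np.argmax(a)] = 1.0
    f_axis = a @ x_axis  # equals max_i a_i

    # ||a|| = sqrt(sum a_i^2) >= max_i a_i, so the parallel direction
    # always does at least as well as the axis-aligned guess.
    print(f_parallel, np.linalg.norm(a), f_axis)
    ```

    The axis-aligned guess only ties the true maximum when ##\vec{a}## itself lies along a coordinate axis; otherwise spreading the weight of ##\vec{x}## across all coordinates in proportion to ##a_i## does strictly better.
    
    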
     