
Derivative of a function involving square root of sum of squares

  1. Oct 11, 2012 #1
    Provided is a function [tex] f(x)=\sum_{j=1}^n ||x-x_j||[/tex], for x being a two dimensional vector, where ||.|| denotes the Euclidean distance in 2D space. How could one obtain a derivative of such a function?
  3. Oct 11, 2012 #2
    If you are just looking for a mechanical derivation, I think you can do it this way. If [itex]||\cdot||[/itex] is the Euclidean norm, then:

    [tex]||x-x_j|| = \sqrt{\sum_{i=1}^N \left(x^{(i)} - x_j^{(i)}\right)^2}[/tex]

    where N is the dimension of the Euclidean space. So:

    [tex]\frac{\partial}{\partial x^{(i)}} ||x-x_j|| = \frac{x^{(i)} - x_j^{(i)}}{||x-x_j||}[/tex]

    and so:

    [tex]\frac{\partial f}{\partial x^{(i)}} = \sum_{j=1}^n \frac{x^{(i)} - x_j^{(i)}}{||x-x_j||}[/tex]
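    This gradient (each term contributes the unit vector from [itex]x_j[/itex] toward [itex]x[/itex]) is easy to sanity-check against a finite difference. A minimal Python sketch, with made-up sample points and evaluation point:

    ```python
    import math

    # Hypothetical 2D sample points x_j and evaluation point x (for illustration only).
    points = [(0.0, 0.0), (3.0, 0.0), (0.0, 4.0)]

    def f(x):
        """Sum of Euclidean distances from x to each x_j."""
        return sum(math.hypot(x[0] - px, x[1] - py) for px, py in points)

    def grad_f(x):
        """Analytic gradient: sum of unit vectors (x - x_j) / ||x - x_j||."""
        gx = gy = 0.0
        for px, py in points:
            d = math.hypot(x[0] - px, x[1] - py)
            gx += (x[0] - px) / d
            gy += (x[1] - py) / d
        return (gx, gy)

    # Compare against a central finite difference at x = (1, 1).
    x = (1.0, 1.0)
    h = 1e-6
    num_gx = (f((x[0] + h, x[1])) - f((x[0] - h, x[1]))) / (2 * h)
    num_gy = (f((x[0], x[1] + h)) - f((x[0], x[1] - h))) / (2 * h)
    ana = grad_f(x)
    print(ana, (num_gx, num_gy))
    ```

    Note the gradient is undefined exactly at one of the [itex]x_j[/itex] (division by zero), which matters for the minimization question below.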
  4. Oct 12, 2012 #3
    Thanks. Now, to minimize f(x) for given 2D parameters x1, x2, x3, ..., x_k, one sets the derivative to zero and solves for x. However, in more than one dimension this problem is non-trivial, I think. What would be the minimizer of f(x) for given 2D parameters x1, x2, x3, ..., x_k?
  5. Oct 12, 2012 #4
    Science Advisor
    Homework Helper
    Gold Member

    That it will be messy can be seen by considering just 3 points in 2 dimensions. If any pair subtends an angle > 120 degrees at the third point, then the minimizer is that third point. Otherwise, it is the (Fermat) point at which each pair subtends exactly 120 degrees.
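    For general point sets this minimizer is the geometric median, and there is no closed form; a standard iterative method is Weiszfeld's algorithm. A rough Python sketch (the triangle used at the end is a made-up example; for an equilateral triangle the Fermat point coincides with the centroid):

    ```python
    import math

    def geometric_median(points, iters=200, tol=1e-9):
        """Weiszfeld's algorithm: fixed-point iteration for the point
        minimizing the sum of Euclidean distances to `points`.
        Caveat: the update is undefined at a data point, so we bail out
        if the iterate lands on one."""
        # Start from the centroid.
        x = sum(p[0] for p in points) / len(points)
        y = sum(p[1] for p in points) / len(points)
        for _ in range(iters):
            wx = wy = wsum = 0.0
            for px, py in points:
                d = math.hypot(x - px, y - py)
                if d < tol:            # iterate coincides with a data point
                    return (px, py)
                w = 1.0 / d            # weight each point by inverse distance
                wx += w * px
                wy += w * py
                wsum += w
            nx, ny = wx / wsum, wy / wsum
            if math.hypot(nx - x, ny - y) < tol:
                break
            x, y = nx, ny
        return (x, y)

    # Equilateral triangle: the minimizer should be the centroid (0.5, sqrt(3)/6).
    pts = [(0.0, 0.0), (1.0, 0.0), (0.5, math.sqrt(3) / 2)]
    print(geometric_median(pts))
    ```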
  6. Oct 12, 2012 #5
    Homework Helper

    Won't the minimizer be the same if you don't take the square root?
  7. Oct 12, 2012 #6
    That would mean squaring each term, not the function itself. Squaring the whole function would give an equivalent problem (since f is nonnegative), but the sum of squared terms is a different function.

    Given a set of points in 2D, the point that minimizes the sum of squared distances to them is the barycenter (centroid); I'm not sure about the sum of (unsquared) distances.
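    The two criteria really do pick different points in general. A quick numerical check, with a made-up asymmetric point set (three clustered points plus one outlier): the centroid wins on sum of squared distances, while a point near the cluster beats it on the plain sum of distances.

    ```python
    import math

    # Hypothetical point set: a tight cluster near the origin plus one outlier.
    pts = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1), (10.0, 0.0)]

    centroid = (sum(p[0] for p in pts) / len(pts),
                sum(p[1] for p in pts) / len(pts))

    def sum_dist(x):
        """Sum of Euclidean distances from x to the points."""
        return sum(math.hypot(x[0] - px, x[1] - py) for px, py in pts)

    def sum_sq(x):
        """Sum of squared Euclidean distances from x to the points."""
        return sum((x[0] - px) ** 2 + (x[1] - py) ** 2 for px, py in pts)

    # A point near the cluster (chosen by eye, not optimized).
    near_cluster = (0.05, 0.03)

    # The centroid minimizes the sum of SQUARED distances, so it beats
    # near_cluster there; near_cluster beats it on the plain sum of distances.
    print(sum_dist(centroid), sum_dist(near_cluster))
    print(sum_sq(centroid), sum_sq(near_cluster))
    ```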
  8. Oct 12, 2012 #7
    Homework Helper

    All I'm saying is that I believe

    $$f(x) = \sum_{j=1}^n ||x - x_j||$$

    has the same minimizer as

    $$g(x) = f(x)^2 = \left( \sum_{j=1}^n ||x - x_j|| \right)^2$$

    I remember from basic calculus that minimizing the distance from a point to a curve is the same as minimizing the distance squared, which is a lot easier to deal with. I think that's also why least squares problems are specifically formulated the way they are. Minimizing the sum of squares is a whole lot easier than minimizing the square root of the sum of squares, and yields the same answer.
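    The same-minimizer claim follows because squaring is monotone on the nonnegative values f takes, which is easy to confirm on a grid. A tiny 1D check with made-up points (in 1D the sum of absolute distances is minimized at the median):

    ```python
    # Hypothetical 1D points; the sum of |x - p| is minimized at the median, 2.0.
    points = [1.0, 2.0, 7.0]

    def f(x):
        return sum(abs(x - p) for p in points)

    # Grid search over [0, 10]: the argmin of f and of f**2 coincide,
    # because x -> x**2 is increasing for x >= 0.
    grid = [i / 100.0 for i in range(0, 1001)]
    xmin_f = min(grid, key=f)
    xmin_f2 = min(grid, key=lambda x: f(x) ** 2)
    print(xmin_f, xmin_f2)  # both 2.0
    ```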