Derivative of a function involving square root of sum of squares

Summary

The discussion focuses on deriving the derivative of the function f(x), defined as the sum of Euclidean distances from a point x to a set of points in 2D space. The gradient works out to a sum of unit vectors pointing from each x_j toward x. Minimizing f(x) is highlighted as the harder problem: in two or more dimensions the minimizer depends on the geometry of the points, for example on the angles they subtend at one another. It is noted that minimizing the sum of squared distances yields the barycenter, while minimizing the plain sum of distances is more involved. The thread closes with the observation that a nonnegative function and its square share the same minimizer, which is why least-squares formulations are so convenient.
Provided is a function f(x)=\sum_{j=1}^n ||x-x_j||, where x is a two-dimensional vector and ||\cdot|| denotes the Euclidean norm, so each term is the distance from x to x_j in the plane. How could one obtain the derivative of such a function?
 
If you are just looking for a mechanical derivation, I think you can do it this way. If ||\cdot|| is the Euclidean norm, then

$$||x-x_j||=\sqrt{\sum_{k=1}^N (x-x_j)_k^2}$$

where N is the dimension of the Euclidean space. Differentiating with respect to the i-th component of x gives

$$\frac{\partial}{\partial x_i}||x-x_j||=\frac{2(x-x_j)_i}{2\sqrt{\sum_{k=1}^N (x-x_j)_k^2}}=\frac{(x-x_j)_i}{||x-x_j||}$$

so the gradient of each term is the unit vector pointing from x_j toward x, and therefore

$$\nabla f(x)=\sum_{j=1}^n \frac{x-x_j}{||x-x_j||}$$

valid wherever x \neq x_j.
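Not part of the original post, but a minimal numerical sketch of the formula above may help (the sample points below are made up); it evaluates f and its gradient with NumPy:

```python
import numpy as np

def f_and_grad(x, points, eps=1e-12):
    """Evaluate f(x) = sum_j ||x - x_j|| and its gradient."""
    x = np.asarray(x, dtype=float)
    points = np.asarray(points, dtype=float)      # shape (n, 2): the x_j
    diffs = x - points                            # rows are x - x_j
    dists = np.linalg.norm(diffs, axis=1)         # ||x - x_j||
    f = dists.sum()
    # each term contributes the unit vector (x - x_j)/||x - x_j||;
    # eps guards against x coinciding with one of the x_j
    grad = (diffs / np.maximum(dists, eps)[:, None]).sum(axis=0)
    return f, grad

pts = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]        # hypothetical sample points
print(f_and_grad([0.3, 0.3], pts))
```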
 
Thanks. Now, faced with the problem of minimizing f(x) for given 2D points x_1, x_2, \ldots, x_n, one sets the gradient to zero and solves for x. However, in more than one dimension this is non-trivial, I think. What is the minimizer of f(x) for given 2D points x_1, x_2, \ldots, x_n?
 
That it will be messy can be seen by considering just 3 points in 2 dimensions. If any pair of points subtends an angle of 120 degrees or more at the third, then the minimizer is that third point. Otherwise, it is the interior point at which each pair subtends exactly 120 degrees.
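For general point sets there is no closed form, but the stationarity condition \nabla f(x)=0 suggests a fixed-point scheme. Below is a minimal sketch of Weiszfeld's iteration, a standard method for this problem (not discussed in the thread; the sample points are made up):

```python
import numpy as np

def geometric_median(points, iters=200, eps=1e-12, tol=1e-10):
    """Weiszfeld fixed-point iteration for argmin_x sum_j ||x - x_j||."""
    points = np.asarray(points, dtype=float)
    x = points.mean(axis=0)                      # start from the barycenter
    for _ in range(iters):
        d = np.maximum(np.linalg.norm(points - x, axis=1), eps)
        w = 1.0 / d                              # inverse-distance weights
        x_new = (w[:, None] * points).sum(axis=0) / w.sum()
        if np.linalg.norm(x_new - x) < tol:
            break
        x = x_new
    return x

pts = [(0.0, 0.0), (4.0, 0.0), (1.0, 3.0)]       # hypothetical sample points
print(geometric_median(pts))
```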
 
Won't the minimizer be the same if you don't take the square root?
 
Dropping the square root means you're squaring each term, not the function itself. If the whole function were squared, the two would indeed have the same minimizer.

Given a set of points in 2D, the point that minimizes the sum of squared distances to them is the barycenter; I'm not sure what minimizes the plain (not squared) sum of distances.
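A quick self-contained check of this (with made-up points): the barycenter minimizes the sum of squared distances, but a crude grid search shows it is generally not the minimizer of the plain sum of distances:

```python
import numpy as np

pts = np.array([(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (10.0, 10.0)])  # made-up points

def sum_dist(x):
    return np.linalg.norm(pts - np.asarray(x), axis=1).sum()

centroid = pts.mean(axis=0)            # minimizes the sum of *squared* distances

# crude grid search for the minimizer of the plain sum of distances
xs = np.linspace(-1.0, 11.0, 121)
grid = [(a, b) for a in xs for b in xs]
best = min(grid, key=sum_dist)

print("barycenter:    ", centroid, " f =", sum_dist(centroid))
print("grid minimizer:", best, " f =", sum_dist(best))
```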
 
All I'm saying is that I believe

$$f(x)=\sqrt{g(x)^2}$$

has the same minimizer as

$$f(x)^2 = g(x)^2$$

I remember from basic calculus that minimizing the distance from a point to a curve is the same as minimizing the squared distance, which is much easier to work with. I think that's also why least-squares problems are formulated the way they are: minimizing the sum of squares is far easier than minimizing the square root of the sum of squares, and it yields the same answer.
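A tiny numerical illustration of that point (the curve and point are chosen arbitrarily): the distance from a fixed point to a parabola and the squared distance attain their minimum at the same parameter value:

```python
import numpy as np

# distance from the point (2, 0) to the parabola y = x^2, as a function of x
xs = np.linspace(-2.0, 2.0, 100001)
d2 = (xs - 2.0) ** 2 + (xs ** 2) ** 2    # squared distance
d = np.sqrt(d2)                          # distance

# sqrt is monotone on [0, inf), so both are minimized at the same x
print(xs[np.argmin(d)], xs[np.argmin(d2)])
```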
 
