Derivative of a function involving square root of sum of squares

SUMMARY

The discussion focuses on differentiating the function f(x) = ∑_{j=1}^n ||x - x_j||, where x is a two-dimensional vector and ||.|| denotes the Euclidean norm. The gradient of this function is ∇f(x) = ∑_{j=1}^n (x - x_j) / ||x - x_j||. The conversation highlights the difficulty of minimizing f(x) in more than one dimension: minimizing the sum of squared distances leads to the barycenter (centroid), while minimizing the sum of distances is harder. It concludes that minimizing f(x) and f(x)^2 yields the same minimizer, since squaring is monotone on nonnegative values.

PREREQUISITES
  • Understanding of Euclidean distance in 2D space
  • Knowledge of calculus, specifically derivatives
  • Familiarity with minimization problems in optimization
  • Basic concepts of barycenters and least squares methods
NEXT STEPS
  • Study the properties of Euclidean norms in multi-dimensional spaces
  • Learn about optimization techniques for minimizing functions
  • Explore the relationship between least squares and distance minimization
  • Investigate barycenters and their applications in data analysis
USEFUL FOR

Mathematicians, data scientists, and anyone involved in optimization problems, particularly in the context of geometric interpretations and distance calculations in multi-dimensional spaces.

onako
Provided is a function

$$f(x)=\sum_{j=1}^n{||x-x_j||}$$

for x a two-dimensional vector, where ||·|| denotes the Euclidean norm in 2D space. How could one obtain the derivative of such a function?
 
If you are just looking for a mechanical derivation, I think you can do it this way. If ||·|| is the Euclidean norm, then:

$$||x-x_j||=\sqrt{\sum_{k=1}^N{{(x-x_j)}_k^2}}$$

where N is the dimension of the Euclidean space. So:

$$\frac{\partial}{\partial x_i}||x-x_j||=\frac{1}{2\sqrt{\sum_{k=1}^N{{(x-x_j)}_k^2}}}\,2{(x-x_j)}_i=\frac{{(x-x_j)}_i}{||x-x_j||}$$

and so, collecting the components into a vector:

$$\nabla f(x)=\sum_{j=1}^n{\frac{x-x_j}{||x-x_j||}}$$
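The gradient ∇f(x) = ∑_j (x − x_j)/||x − x_j|| can be sanity-checked numerically against central finite differences. A minimal sketch in Python; the sample points and test location are arbitrary choices for illustration:

```python
import numpy as np

def f(x, pts):
    """Sum of Euclidean distances from x to each point in pts."""
    return np.sum(np.linalg.norm(pts - x, axis=1))

def grad_f(x, pts):
    """Analytic gradient: sum of unit vectors pointing from each x_j toward x."""
    diffs = x - pts                                   # shape (n, 2)
    norms = np.linalg.norm(diffs, axis=1, keepdims=True)
    return np.sum(diffs / norms, axis=0)

# arbitrary sample points and a test location (away from any x_j)
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
x = np.array([0.3, 0.4])

# central finite differences as a cross-check
h = 1e-6
num = np.array([(f(x + h * e, pts) - f(x - h * e, pts)) / (2 * h)
                for e in np.eye(2)])
assert np.allclose(grad_f(x, pts), num, atol=1e-5)
```

Note that the gradient is undefined exactly at any of the points x_j, where the norm in the denominator vanishes.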
 
Thanks. Now, faced with the problem of minimizing f(x) for given 2D points x1, x2, x3, ..., x_k, one sets the gradient to zero and solves for x. However, in more than one dimension this problem seems non-trivial. What would be the minimizer of f(x) for given 2D points x1, x2, x3, ..., x_k?
 
That it will be messy can be seen by considering just 3 points in 2 dimensions. If any pair of points subtends an angle of 120 degrees or more at the third, then the minimizer is that third point. Otherwise, it is the interior point at which each pair subtends 120 degrees (the Fermat point).
 
Won't the minimizer be the same if you don't take the square root?
 
Dropping the square root squares each term of the sum, not the function itself. If the function as a whole were squared, the minimizers would be equivalent.

Given a set of points in 2D, the point that minimizes the sum of squared distances to them is the barycenter (centroid); I'm not sure about the sum of distances (so, not squared).
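That the two objectives really have different minimizers is easy to show with a small example. The sketch below uses three arbitrary collinear points, where the sum of distances is minimized at the majority point (the coordinate-wise median) rather than at the centroid:

```python
import numpy as np

pts = np.array([[0.0, 0.0], [0.0, 0.0], [10.0, 0.0]])

def sum_dist(x):
    return np.sum(np.linalg.norm(pts - x, axis=1))

def sum_sq(x):
    return np.sum(np.linalg.norm(pts - x, axis=1) ** 2)

centroid = pts.mean(axis=0)        # minimizes the sum of squared distances
median = np.array([0.0, 0.0])      # coordinate-wise median of these points

assert sum_sq(centroid) < sum_sq(median)      # centroid wins on squares
assert sum_dist(median) < sum_dist(centroid)  # median wins on distances
```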
 
All I'm saying is that I believe

$$f(x)=\sqrt{g(x)^2}$$

has the same minimizer as

$$f(x)^2 = g(x)^2$$

I remember from basic calculus that minimizing the distance from a point to a curve is the same as minimizing the distance squared, which is a lot easier to deal with. I think that's also why least squares problems are formulated the way they are. Minimizing the sum of squares is a whole lot easier than minimizing the square root of the sum of squares, and, since squaring is monotone on nonnegative values, it yields the same answer.
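This monotonicity argument can be illustrated directly: the square root cannot move the location of the minimum. A small sketch, using the (arbitrary) example of the distance from the point (3, 0) to the parabola y = x², evaluated on a grid:

```python
import numpy as np

# distance from the point (3, 0) to the parabola y = x^2, parameterized by t
p = np.array([3.0, 0.0])
t = np.linspace(-2, 3, 100001)
curve = np.stack([t, t**2], axis=1)

d2 = np.sum((curve - p) ** 2, axis=1)  # squared distance: easy to minimize
d = np.sqrt(d2)                        # actual distance

# the monotone square root cannot change where the minimum sits
assert np.argmin(d) == np.argmin(d2)
```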
 
