F has a uniquely defined global minimum

In summary, the function $f(x)=\sum_{j=1}^k\|x-a_j\|^2$ with the Euclidean norm has a uniquely defined global minimum. The minimizer is calculated by setting the gradient of $f$ equal to $0$ and solving for $x$, which yields the centroid of the points $a_1,\ldots,a_k$. The Hessian matrix is $2k$ times the identity, so its only eigenvalue $2k$ is positive, showing that the critical point is a local minimum; since $f(x)\to+\infty$ as $\|x\|\to\infty$, this local minimum is in fact the global one.
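For reference, writing $\bar a$ for the centroid of the points, the result derived in the thread below can be stated in closed form as
\begin{equation*}x^*=\bar a=\frac{1}{k}\sum_{j=1}^k a_j, \qquad f(x^*)=\sum_{j=1}^k\|a_j-\bar a\|^2 .\end{equation*}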
  • #1
mathmari
Hey! :eek:

Let $a_1, a_2, \ldots , a_k\in \mathbb{R}^n$ and $f:\mathbb{R}^n\rightarrow \mathbb{R}$ defined by $\displaystyle{f(x)=\sum_{j=1}^k\|x-a_j\|^2}$ with the Euclidean norm $\|\cdot \|$.
Show that $f$ has a uniquely defined global minimum and calculate it. I have done the following:
\begin{equation*}f(x)=\sum_{j=1}^k\|x-a_j\|^2=\sum_{j=1}^k\sum_{i=1}^n(x_i-a_j)^2\end{equation*} The partial derivative in respect to $x_m$ is \begin{align*} f_{x_m}&=\sum_{j=1}^k2(x_m-a_j)=\sum_{j=1}^k2x_m-\sum_{j=1}^ka_j=2x_m\sum_{j=1}^k1-\sum_{j=1}^ka_j=2x_mk-\sum_{j=1}^ka_j\\ & =2kx_m-\sum_{j=1}^ka_j\end{align*}

The gradient is \begin{equation*}\text{grad}(f)=\begin{pmatrix}f_{x_1} \\ f_{x_2} \\ f_{x_3} \\ \vdots \\ f_{x_{n-1}} \\ f_{x_n}\end{pmatrix}=\begin{pmatrix}2kx_1-\displaystyle{\sum_{j=1}^ka_j}\\ 2kx_2-\displaystyle{\sum_{j=1}^ka_j} \\ 2kx_3-\displaystyle{\sum_{j=1}^ka_j} \\ \vdots \\ 2kx_{n-1}-\displaystyle{\sum_{j=1}^ka_j} \\ 2kx_n-\displaystyle{\sum_{j=1}^ka_j}\end{pmatrix}=2k\begin{pmatrix}x_1\\ x_2 \\ x_3 \\ \vdots \\ x_{n-1} \\ x_n\end{pmatrix}-\displaystyle{\sum_{j=1}^ka_j}\begin{pmatrix}1\\ 1 \\ 1 \\ \vdots \\ 1 \\ 1\end{pmatrix}=2kx-\displaystyle{\sum_{j=1}^ka_j}\begin{pmatrix}1\\ 1 \\ 1 \\ \vdots \\ 1 \\ 1\end{pmatrix}\end{equation*}

We get the critical point if we set the gradient equal to $0$: \begin{equation*}2kx-\displaystyle{\sum_{j=1}^ka_j}\begin{pmatrix}1\\ 1 \\ 1 \\ \vdots \\ 1 \\ 1\end{pmatrix}=\begin{pmatrix}0\\ 0 \\ 0 \\ \vdots \\ 0 \\ 0\end{pmatrix} \Rightarrow 2kx=\displaystyle{\sum_{j=1}^ka_j}\begin{pmatrix}1\\ 1 \\ 1 \\ \vdots \\ 1 \\ 1\end{pmatrix}\Rightarrow x=\left (\frac{1}{2k}\cdot \displaystyle{\sum_{j=1}^ka_j}\right )\cdot \begin{pmatrix}1\\ 1 \\ 1 \\ \vdots \\ 1 \\ 1\end{pmatrix}\end{equation*}

So, we have an extremum at $\left (\frac{1}{2k}\cdot \displaystyle{\sum_{j=1}^ka_j}\right )\cdot \begin{pmatrix}1\\ 1 \\ 1 \\ \vdots \\ 1 \\ 1\end{pmatrix}$.

The partial derivatives of second order are
\begin{align*}&f_{x_mx_m}=\frac{\partial}{\partial{x_m}}\left (2kx_m-\sum_{j=1}^ka_j\right )=2k \\ &f_{x_mx_{\ell}}=\frac{\partial}{\partial{x_{\ell}}}\left (2kx_m-\sum_{j=1}^ka_j\right )=0\end{align*}

So the Hessian-Matrix is:
\begin{equation*}H_f(x)=\begin{pmatrix}2k & 0 & 0 & \ldots & 0 & 0 \\ 0 & 2k & 0 & \ldots & 0 & 0 \\ 0 & 0 & 2k & \ldots & 0 & 0 \\ \vdots & \vdots & \ddots & \ddots & \ddots & \vdots \\ 0 & 0 & \ldots & 0 &2k & 0\\ 0 & 0 & \ldots & 0 & 0 &2k\end{pmatrix}\end{equation*}

Since we have a diagonal matrix, we get the eigenvalue from the diagonal: $\lambda=2k>0$.

Since $2k$ is positive, it follows that at the critical point we have a local minimum. Is everything correct? How can we show that this is a global minimum?
Does "uniquely defined" mean that we have just one minimum?

(Wondering)
 
  • #2
mathmari said:
Hey! :eek:

Let $a_1, a_2, \ldots , a_k\in \mathbb{R}^n$ and $f:\mathbb{R}^n\rightarrow \mathbb{R}$ defined by $\displaystyle{f(x)=\sum_{j=1}^k\|x-a_j\|^2}$ with the Euclidean norm $\|\cdot \|$.
Show that $f$ has a uniquely defined global minimum and calculate it. I have done the following:
\begin{equation*}f(x)=\sum_{j=1}^k\|x-a_j\|^2=\sum_{j=1}^k\sum_{i=1}^n(x_i-a_j)^2\end{equation*}

Hey mathmari!

Shouldn't that be:
\begin{equation*}f(x)=\sum_{j=1}^k\|x-a_j\|^2=\sum_{j=1}^k\sum_{i=1}^n(x_i-a_{j;i})^2\end{equation*}
(Thinking)

mathmari said:
The partial derivative in respect to $x_m$ is \begin{align*} f_{x_m}&=\sum_{j=1}^k2(x_m-a_j)=\sum_{j=1}^k2x_m-\sum_{j=1}^ka_j=2x_m\sum_{j=1}^k1-\sum_{j=1}^ka_j=2x_mk-\sum_{j=1}^ka_j\\ & =2kx_m-\sum_{j=1}^ka_j\end{align*}

That also applies here, and wasn't a factor 2 dropped?
That is, shouldn't it be:
\begin{align*} f_{x_m}&=\sum_{j=1}^k2(x_m-a_{j;m})=\sum_{j=1}^k2x_m-2\sum_{j=1}^ka_{j;m}=2x_m\sum_{j=1}^k1-2\sum_{j=1}^ka_{j;m}=2x_mk-2\sum_{j=1}^ka_{j;m}\\ & =2kx_m-2\sum_{j=1}^ka_{j;m}\end{align*}
(Thinking)
 
  • #3
I like Serena said:
Shouldn't that be:
\begin{equation*}f(x)=\sum_{j=1}^k\|x-a_j\|^2=\sum_{j=1}^k\sum_{i=1}^n(x_i-a_{j;i})^2\end{equation*}
(Thinking)
That also applies here, and wasn't a factor 2 dropped?
That is, shouldn't it be:
\begin{align*} f_{x_m}&=\sum_{j=1}^k2(x_m-a_{j;m})=\sum_{j=1}^k2x_m-2\sum_{j=1}^ka_{j;m}=2x_m\sum_{j=1}^k1-2\sum_{j=1}^ka_{j;m}=2x_mk-2\sum_{j=1}^ka_{j;m}\\ & =2kx_m-2\sum_{j=1}^ka_{j;m}\end{align*}
(Thinking)

Oh yes, you're right! (Tmi) The gradient is then equal to \begin{equation*}\text{grad}(f)=\begin{pmatrix}f_{x_1} \\ f_{x_2} \\ f_{x_3} \\ \vdots \\ f_{x_{n-1}} \\ f_{x_n}\end{pmatrix}=\begin{pmatrix}2kx_1-\displaystyle{2\sum_{j=1}^ka_{j;1}}\\ 2kx_2-\displaystyle{2\sum_{j=1}^ka_{j;2}} \\ 2kx_3-\displaystyle{2\sum_{j=1}^ka_{j;3}} \\ \vdots \\ 2kx_{n-1}-\displaystyle{2\sum_{j=1}^ka_{j;(n-1)}} \\ 2kx_n-\displaystyle{2\sum_{j=1}^ka_{j;n}}\end{pmatrix}=2k\begin{pmatrix}x_1\\ x_2 \\ x_3 \\ \vdots \\ x_{n-1} \\ x_n\end{pmatrix}-2\begin{pmatrix}\displaystyle{\sum_{j=1}^ka_{j;1}}\\ \displaystyle{\sum_{j=1}^ka_{j;2}} \\ \displaystyle{\sum_{j=1}^ka_{j;3}} \\ \vdots \\ \displaystyle{\sum_{j=1}^ka_{j;(n-1)}} \\ \displaystyle{\sum_{j=1}^ka_{j;n}}\end{pmatrix}=2kx-2\displaystyle{\sum_{j=1}^ka_{j}}\end{equation*}

For the critical point we set the gradient equal to $0$: \begin{equation*}2kx-2\displaystyle{\sum_{j=1}^ka_{j}}=\mathbf{0} \Rightarrow 2kx=2\displaystyle{\sum_{j=1}^ka_{j}}\Rightarrow x=\frac{1}{k}\displaystyle{\sum_{j=1}^ka_{j}}\end{equation*}

So, we have an extremum at $\frac{1}{k}\displaystyle{\sum_{j=1}^ka_{j}}$.

The partial derivatives of second order are:
\begin{align*}&f_{x_mx_m}=\frac{\partial}{\partial{x_m}}\left (2kx_m-2\sum_{j=1}^ka_{j;m}\right )=2k \\ &f_{x_mx_{\ell}}=\frac{\partial}{\partial{x_{\ell}}}\left (2kx_m-2\sum_{j=1}^ka_{j;m}\right )=0\end{align*}

The Hessian matrix is equal to
\begin{equation*}H_f(x)=\begin{pmatrix}2k & 0 & 0 & \ldots & 0 & 0 \\ 0 & 2k & 0 & \ldots & 0 & 0 \\ 0 & 0 & 2k & \ldots & 0 & 0 \\ \vdots & \vdots & \ddots & \ddots & \ddots & \vdots \\ 0 & 0 & \ldots & 0 &2k & 0\\ 0 & 0 & \ldots & 0 & 0 &2k\end{pmatrix}\end{equation*}

The only eigenvalue is $\lambda=2k>0$. So, it follows that at the critical point we have a local minimum.
Is everything correct now? (Wondering)
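As a quick numerical sanity check of the corrected gradient and Hessian, here is a short NumPy sketch (the points, test point, step size, and tolerances are arbitrary choices for illustration, not part of the thread):

```python
import numpy as np

rng = np.random.default_rng(0)
k, n = 5, 3
a = rng.normal(size=(k, n))        # the points a_1, ..., a_k as rows
x = rng.normal(size=n)             # an arbitrary test point

def f(p):
    # f(p) = sum_j ||p - a_j||^2
    return np.sum(np.linalg.norm(p - a, axis=1) ** 2)

# Analytic gradient from the thread: grad f(x) = 2k x - 2 sum_j a_j
grad_analytic = 2 * k * x - 2 * a.sum(axis=0)

# Central finite-difference gradient for comparison
eps = 1e-4
grad_fd = np.array([(f(x + eps * e) - f(x - eps * e)) / (2 * eps) for e in np.eye(n)])
print(np.allclose(grad_analytic, grad_fd, atol=1e-6))      # expected: True

# Finite-difference Hessian; it should be close to 2k times the identity matrix
hess_fd = np.array([[(f(x + eps * (ei + ej)) - f(x + eps * ei) - f(x + eps * ej) + f(x)) / eps**2
                     for ej in np.eye(n)] for ei in np.eye(n)])
print(np.allclose(hess_fd, 2 * k * np.eye(n), atol=1e-3))  # expected: True
```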
 
  • #4
mathmari said:
Is everything correct now? (Wondering)

Yep. All correct now. (Happy)
 
  • #5
I like Serena said:
Yep. All correct now. (Happy)

Great! (Happy)

Does "uniquely defined" mean that we have just one minimum?

How can we show that the local minimum that we found is a global minimum?

(Wondering)
 
  • #6
mathmari said:
Great! (Happy)

Does "uniquely defined" mean that we have just one minimum?

How can we show that the local minimum that we found is a global minimum?

(Wondering)

Yes. Since there is only 1 critical point, there is only 1 interior local extremum, which is a local minimum.
It is a global minimum if there is no lower value on the boundary, or, as in this case, if the value of $f$ exceeds that local minimum as $\|x\|\to\infty$.
What happens to $f$ as $\|x\|\to\infty$? (Wondering)
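A sketch of the standard argument behind this step (not spelled out in the thread): $f$ is continuous, so if
\begin{equation*}f(x)\to +\infty \quad\text{as } \|x\|\to\infty,\end{equation*}
then $f$ attains a minimum on a sufficiently large closed ball; that minimum lies in the interior, so it occurs at a critical point, and since there is only one critical point it is the global minimizer.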
 
  • #7
I like Serena said:
Yes. Since there is only 1 critical point, there is only 1 interior local extremum, which is a local minimum.
It is a global minimum if there is no lower value on the boundary, or, as in this case, if the value of $f$ exceeds that local minimum as $\|x\|\to\infty$.
What happens to $f$ as $\|x\|\to\infty$? (Wondering)

Ah ok!

Does $f$ tend to infinity as $\|x\|$ goes to infinity? (Wondering)
 
  • #8
mathmari said:
Ah ok!

Does $f$ tend to infinity as $\|x\|$ goes to infinity?

Yes. Can you tell why? (Wondering)
 
  • #9
I like Serena said:
Yes. Can you tell why? (Wondering)

The function must tend to $+\infty$ so that we can tell that the local minimum that we found is also a global one, or not?

We have that $$f(x)=\sum_{j=1}^k\|x-a_j\|^2$$ If $\|x\|\to \infty$ then $\|x-a_j\|\to \infty$ and so the whole right side tends to $+\infty$ because of the square, or not? (Wondering)
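One way to make this limit precise (a short estimate using the reverse triangle inequality, not from the thread): for $\|x\|\ge \max_j\|a_j\|$ we have $\|x-a_j\|\ge \|x\|-\|a_j\|\ge \|x\|-\max_j\|a_j\|\ge 0$, and therefore
\begin{equation*}f(x)=\sum_{j=1}^k\|x-a_j\|^2\ \ge\ k\left(\|x\|-\max_j\|a_j\|\right)^2\ \longrightarrow\ +\infty \quad\text{as } \|x\|\to\infty.\end{equation*}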
 
  • #10
mathmari said:
The function must tend to $+\infty$ so that we can tell that the local minimum that we found is also a global one, or not?

We have that $$f(x)=\sum_{j=1}^k\|x-a_j\|^2$$ If $\|x\|\to \infty$ then $\|x-a_j\|\to \infty$ and so the whole right side tends to $+\infty$ because of the square, or not? (Wondering)

Yep. All correct. (Nod)
 
  • #11
I like Serena said:
Yep. All correct. (Nod)

Great! (Yes) The minimum of the function is therefore equal to \begin{equation*}f\left (\frac{1}{k}\sum_{j=1}^ka_{j}\right )=\sum_{j=1}^k\left \|\frac{1}{k}\sum_{j=1}^ka_{j}-a_j\right \|^2\end{equation*} Can we simplify this expression further? I am a bit confused with the two $a_j$'s. Only the first one is in the inner sum, isn't it? But both are in the outer sum? (Wondering)
 
  • #12
mathmari said:
Great! (Yes) The minimum of the function is therefore equal to \begin{equation*}f\left (\frac{1}{k}\sum_{j=1}^ka_{j}\right )=\sum_{j=1}^k\left \|\frac{1}{k}\sum_{j=1}^ka_{j}-a_j\right \|^2\end{equation*} Can we simplify this expression further? I am a bit confused with the two $a_j$'s. Only the first one is in the inner sum, isn't it? But both are in the outer sum? (Wondering)

Indeed. Let's use a different index variable before substituting, say $\ell$.
Then we get:
\begin{equation*}f\left (\frac{1}{k}\sum_{\ell=1}^ka_{\ell}\right )
=\sum_{j=1}^k\left \|\frac{1}{k}\sum_{\ell=1}^ka_{\ell}-a_j\right \|^2
=\sum_{j=1}^k\left \|\sum_{\ell=1}^k \left(\frac{1}{k}a_{\ell}-\delta_{j\ell}a_j\right)\right \|^2
=\sum_{j=1}^k\left \|\sum_{\ell=1}^k \left(\frac{1}{k}-\delta_{j\ell}\right)a_{\ell}\right \|^2
\end{equation*}
where $\delta_{ij}$ is the Kronecker delta and we used $\delta_{j\ell}\,a_j=\delta_{j\ell}\,a_{\ell}$ to factor out $a_{\ell}$ in the last step, don't we? (Wondering)
 
  • #13
I like Serena said:


Indeed. Let's use a different index variable before substituting, say $\ell$.
Then we get:
\begin{equation*}f\left (\frac{1}{k}\sum_{\ell=1}^ka_{\ell}\right )
=\sum_{j=1}^k\left \|\frac{1}{k}\sum_{\ell=1}^ka_{\ell}-a_j\right \|^2
=\sum_{j=1}^k\left \|\sum_{\ell=1}^k \left(\frac{1}{k}a_{\ell}-\delta_{j\ell}a_j\right)\right \|^2
=\sum_{j=1}^k\left \|\sum_{\ell=1}^k \left(\frac{1}{k}-\delta_{j\ell}\right)a_{\ell}\right \|^2
\end{equation*}
where $\delta_{ij}$ is the Kronecker delta and we used $\delta_{j\ell}\,a_j=\delta_{j\ell}\,a_{\ell}$ to factor out $a_{\ell}$ in the last step, don't we? (Wondering)

Ah ok! Thank you! (Mmm)
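For completeness, the minimum value can be simplified further (a standard expansion, not carried out in the thread). Writing $\bar a=\frac{1}{k}\sum_{\ell=1}^k a_{\ell}$ for the centroid and using $\sum_{j=1}^k\langle a_j,\bar a\rangle=\langle k\bar a,\bar a\rangle=k\|\bar a\|^2$:
\begin{equation*}f(\bar a)=\sum_{j=1}^k\|a_j-\bar a\|^2=\sum_{j=1}^k\left(\|a_j\|^2-2\langle a_j,\bar a\rangle+\|\bar a\|^2\right)=\sum_{j=1}^k\|a_j\|^2-2k\|\bar a\|^2+k\|\bar a\|^2=\sum_{j=1}^k\|a_j\|^2-k\|\bar a\|^2 .\end{equation*}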
 

1. What is a "global minimum" in the context of F having a uniquely defined global minimum?

A global minimum, also known as an absolute minimum, is the lowest value that a function F attains over its entire domain. In other words, it is attained at a point where F is less than or equal to its value at every other point of the domain.
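In symbols (the standard definition, stated here for reference): a point $x^*$ is a global minimizer of $F$ if
\begin{equation*}F(x^*)\le F(x)\quad\text{for all } x \text{ in the domain of } F.\end{equation*}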

2. How is the global minimum of F determined?

The global minimum of F is found by locating the critical points of the function, which are the points where the gradient of F is equal to zero, and comparing the values of F there with the behaviour of F on the boundary of its domain (or as the argument tends to infinity, if the domain is unbounded). For the function in this thread there is a single critical point, the centroid of the points $a_j$, and $f(x)\to+\infty$ as $\|x\|\to\infty$, so that critical point is the global minimum.
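As an illustration of this recipe for the function in this thread, the sketch below uses a generic numerical optimizer (scipy.optimize.minimize; the points and starting guess are arbitrary choices for illustration) and checks that it lands on the centroid, the unique critical point:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
k, n = 6, 4
a = rng.normal(size=(k, n))            # the points a_1, ..., a_k

def f(p):
    # f(p) = sum_j ||p - a_j||^2
    return float(np.sum(np.linalg.norm(p - a, axis=1) ** 2))

result = minimize(f, x0=np.zeros(n))   # generic gradient-based minimizer (BFGS by default)
centroid = a.mean(axis=0)              # the unique critical point found analytically

print(np.allclose(result.x, centroid, atol=1e-4))   # expected: True
print(f(result.x), f(centroid))                     # essentially equal minimum values
```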

3. What does it mean for F to have a uniquely defined global minimum?

If F has a uniquely defined global minimum, it means that there is exactly one point in its domain at which F attains its lowest value. This point is the global minimizer, and every other point of the domain gives a strictly larger value of F.
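In symbols, uniqueness means that the defining inequality is strict away from the minimizer:
\begin{equation*}F(x^*)<F(x)\quad\text{for all } x\neq x^* \text{ in the domain of } F.\end{equation*}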

4. Can a function have more than one global minimum?

The global minimum value of a function is unique by definition, but it can be attained at more than one point: for example, $f(x)=(x^2-1)^2$ attains its minimum value $0$ at both $x=1$ and $x=-1$. A function can also have local minima without having a global minimum at all, for instance when it keeps decreasing towards negative infinity. For the strictly convex function considered in this thread, however, the global minimum is attained at exactly one point.

5. Why is it important for F to have a uniquely defined global minimum?

Having a uniquely defined global minimum is important because it gives a clear picture of the behaviour of the function and makes optimization straightforward, since there is a single point to look for when seeking the lowest value of F. It also serves as a consistency check: if a calculation appears to produce more than one global minimum for such a function, something has gone wrong.
