# The decomposition for a symmetric positive definite matrix is unique

In summary, the conversation discusses the Cholesky decomposition of a symmetric positive definite matrix. It is shown that the decomposition is unique once the diagonal entries of the lower triangular factor are required to be positive. The thread also touches on the hinted LU decomposition, on how the uniqueness proof proceeds, and on the conditions under which the decomposition exists.

#### mathmari

Hey!

We have the matrix \begin{equation*}A=\begin{pmatrix}1/2 & 1/5 & 1/10 & 1/17 \\ 1/5 & 1/2 & 1/5 & 1/10 \\ 1/10 & 1/5 & 1/2 & 1/5 \\ 1/17 & 1/10 & 1/5 & 1/10\end{pmatrix}\end{equation*}

I have applied the Cholesky decomposition and found that $A=\tilde{L}\cdot \tilde{L}^T$ where
\begin{equation*}\tilde{L}=\begin{pmatrix}\frac{1}{\sqrt{2}} & 0 & 0 & 0 \\ \frac{\sqrt{2}}{5} & \sqrt{\frac{21}{50}} & 0 & 0 \\ \frac{\sqrt{2}}{10} & \frac{4\sqrt{42}}{105} & \frac{2\sqrt{1155}}{105} & 0 \\ \frac{\sqrt{2}}{17} & \frac{13}{17\sqrt{42}} & \frac{142}{17\sqrt{1155}} & \sqrt{\frac{298}{15895}}\end{pmatrix} \ \ \ \text{ and } \ \ \ \tilde{L}^T=\begin{pmatrix}\frac{1}{\sqrt{2}} & \frac{\sqrt{2}}{5} & \frac{\sqrt{2}}{10} & \frac{\sqrt{2}}{17} \\ 0 & \sqrt{\frac{21}{50}} & \frac{4\sqrt{42}}{105} & \frac{13}{17\sqrt{42}} \\ 0 & 0 & \frac{2\sqrt{1155}}{105} & \frac{142}{17\sqrt{1155}} \\ 0 & 0 & 0 & \sqrt{\frac{298}{15895}} \end{pmatrix}\end{equation*}

Is that correct? (Wondering)

I want to show that the decomposition for a symmetric positive definite matrix is unique, and it is given as a hint that we have to use the LU decomposition.

Could you explain to me how we can do that? (Wondering)

mathmari said:
Is that correct? I want to show that the decomposition for a symmetric positive definite matrix is unique, and it is given as a hint that we have to use the LU decomposition.

Could you explain to me how we can do that?

Hey mathmari! (Wave)

I checked a couple of the entries and they were correct.
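As a fuller check than spot-checking entries, the whole factor can be verified numerically. A sketch using NumPy, comparing against `np.linalg.cholesky` (which also returns the factor with a positive diagonal):

```python
import numpy as np

# The matrix A from the question
A = np.array([[1/2,  1/5,  1/10, 1/17],
              [1/5,  1/2,  1/5,  1/10],
              [1/10, 1/5,  1/2,  1/5],
              [1/17, 1/10, 1/5,  1/10]])

L = np.linalg.cholesky(A)          # lower triangular, positive diagonal
assert np.allclose(L @ L.T, A)     # reconstructs A

# Spot-check a few of the closed-form entries claimed above
assert np.isclose(L[0, 0], 1/np.sqrt(2))
assert np.isclose(L[1, 1], np.sqrt(21/50))
assert np.isclose(L[3, 3], np.sqrt(298/15895))
```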

Suppose we check for every entry in $L$ what the possibilities are, starting from the top left.
Which options do we have for $l_{11}$?
After that, which options do we have for $l_{21}$? (Wondering)

Klaas van Aarsen said:
Suppose we check for every entry in $L$ what the possibilities are, starting from the top left.
Which options do we have for $l_{11}$?
After that, which options do we have for $l_{21}$? (Wondering)

How can we do that? I'm stuck right now. (Wondering)

mathmari said:
How can we do that? I'm stuck right now.

We have:
$$LL^T=\begin{pmatrix}l_{11}&0&0&\cdots\\*&*&0&\cdots\\\cdots\end{pmatrix} \begin{pmatrix}l_{11}&*&\cdots\\0&*&\cdots\\0&0&\cdots\\\cdots\end{pmatrix} =\begin{pmatrix}a_{11}&*&\cdots\\*&*&\cdots\\\cdots\end{pmatrix}$$
So we must have that $l_{11}^2=a_{11}$, don't we? (Wondering)

Consequently we must have $l_{11}=\pm\sqrt{a_{11}}$.
Hmm... there are 2 options, aren't there?
Did we have any further restrictions on $L$? Are the diagonal entries perhaps required to be positive?
Because otherwise the decomposition is not unique. (Doh)

Klaas van Aarsen said:
We have:
$$LL^T=\begin{pmatrix}l_{11}&0&0&\cdots\\*&*&0&\cdots\\\cdots\end{pmatrix} \begin{pmatrix}l_{11}&*&\cdots\\0&*&\cdots\\0&0&\cdots\\\cdots\end{pmatrix} =\begin{pmatrix}a_{11}&*&\cdots\\*&*&\cdots\\\cdots\end{pmatrix}$$
So we must have that $l_{11}^2=a_{11}$, don't we? (Wondering)

Consequently we must have $l_{11}=\pm\sqrt{a_{11}}$.
Hmm... there are 2 options, aren't there?
Did we have any further restrictions on $L$? Are the diagonal entries perhaps required to be positive?
Because otherwise the decomposition is not unique. (Doh)

For the Cholesky decomposition or the LU decomposition? (Wondering)

mathmari said:
For the Cholesky decomposition or the LU decomposition? (Wondering)

To be honest, it's not clear to me yet why we have the hint about using an LU decomposition.
It seems to me we don't need it, but perhaps I've missed something and it'll become clear later. (Thinking)

Either way, I meant for the Cholesky decomposition.
Note that $L\cdot L^T=(-L)\cdot(-L)^T$, so there are 2 decompositions for sure, unless we set a restriction on $L$.
And indeed, a Cholesky decomposition is only unique if we have positive entries on the diagonal of the $L$ matrix. See the wiki page. (Nerd)
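To make the non-uniqueness concrete: flipping the sign of any subset of columns of $L$ gives another lower triangular factor, since $(LD)(LD)^T = LDD^TL^T = LL^T$ for any diagonal sign matrix $D$ with $D^2=I$. A small NumPy sketch (the matrix is a made-up SPD example):

```python
import numpy as np
from itertools import product

A = np.array([[4.0, 2.0],
              [2.0, 3.0]])         # small symmetric positive definite example
L = np.linalg.cholesky(A)

# Every diagonal sign matrix D gives another factorization A = (L D)(L D)^T,
# because D @ D.T is the identity.
for signs in product([1.0, -1.0], repeat=2):
    D = np.diag(signs)
    M = L @ D                      # still lower triangular
    assert np.allclose(M @ M.T, A)

# Only signs == (1, 1) keeps the diagonal of the factor positive,
# which is exactly the restriction that makes the decomposition unique.
```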

Klaas van Aarsen said:
To be honest, it's not clear to me yet why we have the hint about using an LU decomposition.
It seems to me we don't need it, but perhaps I've missed something and it'll become clear later. (Thinking)

Either way, I meant for the Cholesky decomposition.
Note that $L\cdot L^T=(-L)\cdot(-L)^T$, so there are 2 decompositions for sure, unless we set a restriction on $L$.
And indeed, a Cholesky decomposition is only unique if we have positive entries on the diagonal of the $L$ matrix. See the wiki page. (Nerd)

We have the following:

Let $A$ be symmetric. Then $A = LR = A^T = R^TL^T$. Since $LR = R^TL^T$ for symmetric matrices, we want a decomposition $A=\tilde{L}\tilde{L}^T$, where $\tilde{L}$ is a lower triangular matrix that does not necessarily have $1$s on the diagonal, or $A=LDL^T$, where $D$ is a diagonal matrix and $L$ is a lower triangular matrix with ones on the diagonal. Then $D$ has only positive entries and we get $\tilde{L}=LD$. Does this help us? (Wondering)

mathmari said:
We have the following:

Let $A$ be symmetric. Then $A = LR = A^T = R^TL^T$. Since $LR = R^TL^T$ for symmetric matrices, we want a decomposition $A=\tilde{L}\tilde{L}^T$, where $\tilde{L}$ is a lower triangular matrix that does not necessarily have $1$s on the diagonal, or $A=LDL^T$, where $D$ is a diagonal matrix and $L$ is a lower triangular matrix with ones on the diagonal. Then $D$ has only positive entries and we get $\tilde{L}=LD$.

Does this help us?

Well, at least that formulation seems to imply that $\tilde L$ has positive entries on its diagonal.
That's because $L$ is defined to have $1$'s on its diagonal, and $D$ has positive entries on its diagonal. (Thinking)

Btw, I think that should be $\tilde{L}=LD^{1/2}$, because otherwise the equality $A=\tilde{L}\tilde{L}^T$ does not hold. (Nerd)

Is your mind set to use the hint of LU decomposition?
From that text, it seems as if there are already some properties known about LU decompositions.
Such as that a symmetric $A$ is guaranteed to have an $LR$ decomposition.
Can we use that? (Wondering)

I think we want to prove that there exists an $LL^T$ decomposition, and that it is unique (up to a sign).

To prove it exists, we can follow the drift of that text and formalize/prove it.
If we have an $LR$ decomposition, we can extract a diagonal matrix $D$, can't we?
That is, I think we can get $A=L' D R'$ where $L'$ and $R'$ have $1$'s on the diagonal.
And if $A$ is positive definite, then $D$ must have a positive diagonal, mustn't it?
Afterwards we can deduce that $R'$ must be equal to $L'^T$, giving us the $L'DL'^T$ form.
And finally we can construct $\tilde L=L'D^{1/2}$. (Thinking)

To prove it is unique, we can do what I already suggested.
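The construction above can be sketched in code: take an unpivoted LU decomposition (for a positive definite $A$ no pivoting is needed), pull $D$ out as the diagonal of $U$, check that $R'=L'^T$, and form $\tilde L = L'D^{1/2}$. A NumPy sketch with a made-up SPD example:

```python
import numpy as np

def lu_no_pivot(A):
    """Doolittle LU without pivoting: A = L @ U with L unit lower triangular."""
    n = A.shape[0]
    L = np.eye(n)
    U = A.astype(float).copy()
    for k in range(n - 1):
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]
            U[i, k:] -= L[i, k] * U[k, k:]
    return L, U

A = np.array([[4.0, 2.0, 2.0],
              [2.0, 5.0, 3.0],
              [2.0, 3.0, 6.0]])   # made-up symmetric positive definite example

Lp, U = lu_no_pivot(A)            # A = L' U
d = np.diag(U)                    # D = diag(U); positive when A is SPD
assert np.all(d > 0)
Rp = U / d[:, None]               # R' = D^{-1} U has 1's on the diagonal
assert np.allclose(Rp, Lp.T)      # for symmetric A, R' = L'^T, so A = L' D L'^T
L_tilde = Lp * np.sqrt(d)         # tilde L = L' D^{1/2}: scales column j by sqrt(d_j)
assert np.allclose(L_tilde @ L_tilde.T, A)
```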

Klaas van Aarsen said:
Btw, I think that should be $\tilde{L}=LD^{1/2}$, because otherwise the equality $A=\tilde{L}\tilde{L}^T$ does not hold. (Nerd)

Ah yes!
Klaas van Aarsen said:
Is your mind set to use the hint of LU decomposition?

If there were no hint, how would we show that? (Wondering)
Klaas van Aarsen said:
I think we want to prove that there exists an $LL^T$ decomposition, and that it is unique (up to a sign).

To prove it exists, we can follow the drift of that text and formalize/prove it.
If we have an $LR$ decomposition, we can extract a diagonal matrix $D$, can't we?
That is, I think we can get $A=L' D R'$ where $L'$ and $R'$ have $1$'s on the diagonal.
And if $A$ is positive definite, then $D$ must have a positive diagonal, mustn't it?
Afterwards we can deduce that $R'$ must be equal to $L'^T$, giving us the $L'DL'^T$ form.
And finally we can construct $\tilde L=L'D^{1/2}$. (Thinking)

I see (Nerd)
Klaas van Aarsen said:
To prove it is unique, we can do what I already suggested.

You mean to show that $\ell_{11}$ is unique and then the next elements that depend on this one are also unique? (Wondering)

mathmari said:
If there were no hint, how would we show that?

You mean to show that $\ell_{11}$ is unique and then the next elements that depend on this one are also unique?

Yes.
That proves uniqueness, which holds provided at least one decomposition exists.
To prove existence, I think we can use that the determinant of every leading principal minor must be positive. (Thinking)

Klaas van Aarsen said:
Yes.
That proves uniqueness, which holds provided at least one decomposition exists.
To prove existence, I think we can use that the determinant of every leading principal minor must be positive. (Thinking)

I'm stuck right now. So do we not need the hint of the LU decomposition? (Wondering)

Folks, this thread has been running for a while now. I'd like to suggest a much shorter proof.

Re existence of the Cholesky decomposition: it is guaranteed by the existence of a square root of $A$, which we can call $B$. Now run QR factorization on $B$. This gives the proof of existence. It also suggests one way of proving uniqueness -- prove the uniqueness of $R$ when doing QR factorization on an invertible matrix. There are many proofs of this, though they are all a bit granular.
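The square-root-plus-QR existence argument can be sketched numerically: take the symmetric square root $B$ of $A$ via the spectral decomposition, QR-factor it, and read off $L = R^T$, since $A = B^TB = R^TQ^TQR = R^TR$. A NumPy sketch with a made-up SPD matrix:

```python
import numpy as np

A = np.array([[4.0, 2.0],
              [2.0, 3.0]])         # made-up symmetric positive definite example

# Symmetric square root B of A via the spectral decomposition A = V diag(w) V^T
w, V = np.linalg.eigh(A)
B = V @ np.diag(np.sqrt(w)) @ V.T
assert np.allclose(B @ B, A)

# QR-factor B: A = B^T B = R^T Q^T Q R = R^T R, so L = R^T is a Cholesky factor
Q, R = np.linalg.qr(B)
L = R.T
assert np.allclose(L @ L.T, A)
assert np.allclose(L, np.tril(L))  # L is indeed lower triangular
```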

A better approach for uniqueness: when you are trying to prove uniqueness and are working with invertible matrices, explicitly invert them in the proof.

So to prove uniqueness of the Cholesky decomposition for a real symmetric $A \succ 0$, suppose for a contradiction that you find (at least) two distinct decompositions

$A = LL^T = SS^T$

where $L$ and $S$ are lower triangular but $L \neq S$. For avoidance of doubt, all matrices here have real scalars and are $n\times n$, and there is a constraint that the diagonal entries of the triangular matrices are positive.

So take the equation $A = LL^T = SS^T$ and multiply on the left by $L^{-1}$ and on the right by $(L^T)^{-1}$.

This implies that $Q:=\big(L^{-1}S\big)^{T}$ is orthogonal and triangular, which gives the contradiction. Why?
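A small numerical illustration (on a made-up SPD example): from $LL^T = SS^T$ one gets $(L^{-1}S)(L^{-1}S)^T = I$, so $Q$ is orthogonal. Dropping the positive-diagonal constraint and taking $S = -L$ shows why the constraint is needed: then $Q = -I$, which is orthogonal and triangular but not the identity.

```python
import numpy as np

A = np.array([[4.0, 2.0],
              [2.0, 3.0]])            # made-up SPD example
L = np.linalg.cholesky(A)             # positive diagonal
S = -L                                # also satisfies S @ S.T == A, but diag(S) < 0

Q = np.linalg.solve(L, S).T           # Q = (L^{-1} S)^T
assert np.allclose(Q @ Q.T, np.eye(2))   # orthogonal: follows from L L^T = S S^T
assert np.allclose(Q, np.triu(Q))        # triangular
assert np.allclose(Q, -np.eye(2))        # here Q = -I, not I: uniqueness fails
```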

steep said:
So to prove uniqueness of the Cholesky decomposition for a real symmetric $A \succ 0$, suppose for a contradiction that you find (at least) two distinct decompositions

$A = LL^T = SS^T$

where $L$ and $S$ are lower triangular but $L \neq S$. For avoidance of doubt, all matrices here have real scalars and are $n\times n$, and there is a constraint that the diagonal entries of the triangular matrices are positive.

So take the equation $A = LL^T = SS^T$ and multiply on the left by $L^{-1}$ and on the right by $(L^T)^{-1}$.

This implies that $Q:=\big(L^{-1}S\big)^{T}$ is orthogonal and triangular, which gives the contradiction. Why?

Is an orthogonal and triangular matrix the identity matrix? If this holds, then we get $L=S$ and so we get a contradiction, right? (Wondering) So in the proof we don't need the LR decomposition, do we? (Wondering)

mathmari said:
Is an orthogonal and triangular matrix the identity matrix? If this holds, then we get $L=S$ and so we get a contradiction, right? (Wondering) So in the proof we don't need the LR decomposition, do we? (Wondering)

You're right that the LR decomposition is not needed with this approach.

To complete this proof you'd need to fill in the blanks as to how you know $Q$ is orthogonal and triangular, and why that implies it is diagonal with each entry on the unit circle (this is where the constraint that $L$ and $S$ have only positive entries on their respective diagonals kicks in: it forces $Q$ to be the identity matrix).

With respect to $Q$ being necessarily diagonal, there are several ways to prove this... one is a very natural inductive proof; another makes use of the fact that orthogonal matrices have determinant of magnitude 1, together with special properties of determinants of triangular matrices. Some of the ideas are closely related to applying Schur triangularization to a normal matrix. I will be curious to see which approach you take...
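For reference, the inductive argument can be written out in a few lines (a sketch). If $Q$ is orthogonal and upper triangular, its first column is $q_{11}e_1$, and orthonormality of the columns gives
$$\|Qe_1\| = |q_{11}| = 1, \qquad (Qe_1)^T(Qe_j) = q_{11}q_{1j} = 0 \;\Rightarrow\; q_{1j} = 0 \text{ for } j > 1,$$
so the first row and column of $Q$ reduce to $\pm e_1$, the trailing $(n-1)\times(n-1)$ block is again orthogonal and upper triangular, and by induction $Q$ is diagonal with entries $\pm 1$. If in addition $Q$ has a positive diagonal (inherited from the positive diagonals of $L$ and $S$, since the diagonal of $L^{-1}S$ has entries $s_{ii}/l_{ii}>0$), then $Q = I$.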