MHB New to Linear Algebra - LU Decomposition

pp123123
Just came across LU decomposition and I am not sure how to work on this problem:

Let $L$ and $L_1$ be invertible lower triangular matrices, and let $U$ and $U_1$ be invertible upper triangular matrices. Show that $LU = L_1U_1$ if and only if there exists an invertible diagonal matrix $D$ such that $L_1 = LD$ and $U_1 = D^{-1}U$. [Hint: Scrutinize $L^{-1}L_1 = UU_1^{-1}$.]

I managed to get as far as $L^{-1}L_1 = UU_1^{-1}$, but I am not sure what to do next. Could you give me some hints? (I also don't actually know how to prove iff statements.)

Thanks!
 
pp123123 said:
Just came across LU decomposition and I am not sure how to work on this problem:

Let $L$ and $L_1$ be invertible lower triangular matrices, and let $U$ and $U_1$ be invertible upper triangular matrices. Show that $LU = L_1U_1$ if and only if there exists an invertible diagonal matrix $D$ such that $L_1 = LD$ and $U_1 = D^{-1}U$. [Hint: Scrutinize $L^{-1}L_1 = UU_1^{-1}$.]

I managed to get as far as $L^{-1}L_1 = UU_1^{-1}$, but I am not sure what to do next. Could you give me some hints? (I also don't actually know how to prove iff statements.)
In the equation $L^{-1}L_1 = UU_1^{-1}$, the left side is a lower-triangular matrix, and the right side is an upper-triangular matrix. If they are equal then they must represent a matrix that is both lower-triangular and upper-triangular. What can you say about such a matrix?

To prove an iff statement, you must show that the implication works in both directions. In this case, you first need to prove that if $LU = L_1U_1$ then there exists an invertible diagonal matrix $D$ such that $L_1 = LD$ and $U_1 = D^{-1}U$. Then you also have to prove the converse implication, namely that if there exists an invertible diagonal matrix $D$ such that $L_1 = LD$ and $U_1 = D^{-1}U$ then it follows that $LU = L_1U_1$.
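If it helps to see the hint in action, here is a quick numerical sketch in Python/NumPy (the matrices are made up purely for illustration, not part of the problem): rescale one factorisation by an invertible diagonal matrix and check that $L^{-1}L_1 = UU_1^{-1}$ really does come out diagonal.

Code:
# Numerical illustration only; the example matrices are arbitrary.
import numpy as np

L = np.array([[2.0, 0.0, 0.0],
              [1.0, 3.0, 0.0],
              [4.0, 5.0, 1.0]])   # invertible lower triangular
U = np.array([[1.0, 2.0, 3.0],
              [0.0, 4.0, 5.0],
              [0.0, 0.0, 6.0]])   # invertible upper triangular
D = np.diag([2.0, -1.0, 0.5])    # invertible diagonal

L1 = L @ D                       # scaling columns keeps L1 lower triangular
U1 = np.linalg.inv(D) @ U        # scaling rows keeps U1 upper triangular

print(np.allclose(L @ U, L1 @ U1))             # True: the two products agree
M = np.linalg.inv(L) @ L1
print(np.allclose(M, U @ np.linalg.inv(U1)))   # True: both sides give the same matrix
print(np.allclose(M, np.diag(np.diag(M))))     # True: that matrix is diagonal (it equals D)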
 
Opalg said:
In the equation $L^{-1}L_1 = UU_1^{-1}$, the left side is a lower-triangular matrix, and the right side is an upper-triangular matrix. If they are equal then they must represent a matrix that is both lower-triangular and upper-triangular. What can you say about such a matrix?

To prove an iff statement, you must show that the implication works in both directions. In this case, you first need to prove that if $LU = L_1U_1$ then there exists an invertible diagonal matrix $D$ such that $L_1 = LD$ and $U_1 = D^{-1}U$. Then you also have to prove the converse implication, namely that if there exists an invertible diagonal matrix $D$ such that $L_1 = LD$ and $U_1 = D^{-1}U$ then it follows that $LU = L_1U_1$.

Oh I get it. So is it okay to write something like:

Since $L^{-1}L_1 = UU_1^{-1}$, this matrix is both lower-triangular (the left side) and upper-triangular (the right side), so it must be diagonal. Denote this diagonal matrix by $D$.

$L^{-1}L_1 = UU_1^{-1} = D$
Since $D$ is a product of two invertible matrices, $D$ must be invertible as well. Moreover,
$L^{-1}L_1 = D$
$LL^{-1}L_1=LD$
$L_1=LD$

Similarly, from $UU_1^{-1} = D$, multiplying on the right by $U_1$ gives
$UU_1^{-1}U_1=DU_1$
$U=DU_1$
$D^{-1}U=D^{-1}DU_1$
$U_1=D^{-1}U$

On the other hand, given $L_1=LD$ and $U_1=D^{-1}U$,
$L_1U_1=LDD^{-1}U$
$L_1U_1=LU$

Thus, the statement is proved.
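As a concrete sanity check (with numbers made up just for illustration), take
$$L = \begin{pmatrix}1&0\\1&1\end{pmatrix},\qquad U = \begin{pmatrix}2&2\\0&3\end{pmatrix},\qquad D = \begin{pmatrix}2&0\\0&3\end{pmatrix}.$$
Then $L_1 = LD = \begin{pmatrix}2&0\\2&3\end{pmatrix}$ and $U_1 = D^{-1}U = \begin{pmatrix}1&1\\0&1\end{pmatrix}$, and indeed $L_1U_1 = \begin{pmatrix}2&2\\2&5\end{pmatrix} = LU$, while $L^{-1}L_1 = UU_1^{-1} = D$.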

Many thanks!
 