Best method for LU decomposition

Jameson
Hi all,

I realize there might not be a "best method" but I want to ask if anyone has any improvements to the method taught in my class. I've looked at the Wikipedia page on this already.

Let's use the matrix [math]\left( \begin{array}{ccc} 2 & -1 & 2 \\ -6 & 0 & -2 \\ 8 & -1 & 5 \end{array} \right)[/math].

My teacher said to get the upper triangular matrix, $U$, we should simply find an echelon form of this matrix.

The quickest one for this appears to be [math]\left( \begin{array}{ccc} 2 & -1 & 2 \\ 0 & -3 & 4 \\ 0 & 0 & 1 \end{array} \right)[/math].

Now we need to find the lower triangular matrix, $L$. The way I was told to do this is by setting up another matrix in the following form: [math]\left( \begin{array}{ccc} 1 & 0 & 0 \\ a & 1 & 0 \\ b & c & 1 \end{array} \right)[/math].

Finally if we use the fact that [math]\left( \begin{array}{ccc} 1 & 0 & 0 \\ a & 1 & 0 \\ b & c & 1 \end{array} \right) \left( \begin{array}{ccc} 2 & -1 & 2 \\ 0 & -3 & 4 \\ 0 & 0 & 1 \end{array} \right) = \left( \begin{array}{ccc} 2 & -1 & 2 \\ -6 & 0 & -2 \\ 8 & -1 & 5 \end{array} \right)[/math]

then we should be able to solve for $a,b,c$. Is there a better method that anyone uses, or is this one just fine?
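(For concreteness, here is the same bookkeeping as a short NumPy sketch. This is just my own illustration of the multiplier-recording idea, assuming no row swaps are ever needed; `lu_no_pivot` is a made-up name.)

```python
import numpy as np

def lu_no_pivot(A):
    """Sketch: LU by Gaussian elimination, recording multipliers in L.

    Assumes no row swaps are needed (every pivot is nonzero), which
    happens to hold for the example matrix in this thread.
    """
    A = np.array(A, dtype=float)
    n = A.shape[0]
    L = np.eye(n)
    U = A.copy()
    for k in range(n - 1):               # eliminate below pivot U[k, k]
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]  # the multiplier becomes an entry of L
            U[i, :] -= L[i, k] * U[k, :]
    return L, U

A = np.array([[2, -1, 2],
              [-6, 0, -2],
              [8, -1, 5]], dtype=float)
L, U = lu_no_pivot(A)
print(L)                       # a = -3, b = 4, c = -1 below the diagonal
print(U)                       # matches the echelon form above
print(np.allclose(L @ U, A))   # True
```

Running it gives $a=-3$, $b=4$, $c=-1$, the same values as solving the system above.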

Thank you!
 
Cholesky works faster, but it is only applicable to Hermitian positive definite matrices. LU is used mostly when you have multiple right-hand sides of the equation $Ax=b$ to solve. If you have a single equation $Ax=b$, then nothing, absolutely nothing, beats Gaussian elimination with back substitution (GEBS) among the "exact" methods. I put exact in quotes because for very large systems, GEBS has worse round-off error than many of the iterative methods, so some of the iterative methods give you more accurate results. In particular, there are some very nice algorithms for sparse matrices that can be useful.
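For instance, with SciPy you can factor once and reuse the factorization for each new $b$ (a quick sketch; nothing here is specific to the matrix in the OP):

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

A = np.array([[2., -1., 2.],
              [-6., 0., -2.],
              [8., -1., 5.]])

lu, piv = lu_factor(A)   # one O(n^3) factorization (with partial pivoting)
for b in (np.array([1., 0., 0.]),
          np.array([0., 1., 0.])):
    x = lu_solve((lu, piv), b)   # each extra right-hand side is only O(n^2)
    print(x, np.allclose(A @ x, b))
```

Each `lu_solve` call is only $O(n^2)$, while re-running elimination from scratch for every $b$ would cost $O(n^3)$ each time.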
 
Ackbach said:
Cholesky works faster, but it is only applicable to Hermitian positive definite matrices. LU is used mostly when you have multiple right-hand sides of the equation $Ax=b$ to solve. If you have a single equation $Ax=b$, then nothing, absolutely nothing, beats Gaussian elimination with back substitution (GEBS) among the "exact" methods. I put exact in quotes because for very large systems, GEBS has worse round-off error than many of the iterative methods, so some of the iterative methods give you more accurate results. In particular, there are some very nice algorithms for sparse matrices that can be useful.

We covered it with the explanation that if you have many equations of the form $Ax=b$ for multiple $b$, then it's faster to use $x=A^{-1}b$ when $A$ is invertible; when it isn't, or when $A$ isn't square, $LU$ decomposition can be employed. So if I have a non-square matrix, then the method I outlined seems to be the fastest for decomposition, correct?
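For what it's worth, I checked that `scipy.linalg.lu` does accept a rectangular matrix: it returns $P$, $L$, $U$ with inner dimension $\min(m,n)$. A quick sketch with an arbitrary $3\times 4$ example:

```python
import numpy as np
from scipy.linalg import lu

A = np.array([[2., -1., 2., 1.],
              [-6., 0., -2., 0.],
              [8., -1., 5., 3.]])   # 3x4, so not square
P, L, U = lu(A)   # P: 3x3 permutation, L: 3x3 unit lower, U: 3x4 echelon
print(np.allclose(P @ L @ U, A))   # True
```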
 
Well, I could be wrong, but I was under the impression that $LU$ is faster than $x=A^{-1}b$, even when (obviously) $A$ is invertible. I'd have to count the operations for the $L$ and $U$ solves and compare them to the matrix multiplication $A^{-1}b$.
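If I remember the counts right, the factorization itself is about $\tfrac{2}{3}n^3$ flops while forming $A^{-1}$ costs roughly $2n^3$, and the two triangular solves are the same $O(n^2)$ as the multiplication $A^{-1}b$, so $LU$ should win. A rough timing sketch (my own, with an arbitrary random matrix):

```python
import time
import numpy as np
from scipy.linalg import lu_factor, lu_solve

rng = np.random.default_rng(0)
n = 2000
A = rng.standard_normal((n, n))
b = rng.standard_normal(n)

t0 = time.perf_counter()
lu, piv = lu_factor(A)          # ~ (2/3) n^3 flops
x_lu = lu_solve((lu, piv), b)   # ~ 2 n^2 flops
t1 = time.perf_counter()

Ainv = np.linalg.inv(A)         # ~ 2 n^3 flops
x_inv = Ainv @ b
t2 = time.perf_counter()

print(f"LU factor+solve: {t1 - t0:.3f}s, inverse+multiply: {t2 - t1:.3f}s")
print(np.abs(x_lu - x_inv).max())   # the two answers agree to round-off
```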
 
Ackbach said:
Well, I could be wrong, but I was under the impression that $LU$ is faster than $x=A^{-1}b$, even when (obviously) $A$ is invertible. I'd have to count the operations for the $L$ and $U$ solves and compare them to the matrix multiplication $A^{-1}b$.

It very well might be; I didn't mean to imply otherwise in my previous post, sorry. My main point was that when $A$ isn't invertible and you are solving for multiple $b$, some kind of decomposition is necessary.

I'm more interested in the method for finding $L$ and $U$. Wikipedia outlines a couple of methods, and my book gives a different one. Among all of these, I think I like the one in the OP best. Is that how you would find the $LU$ decomposition?
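For reference, here's my reading of the formula-based (Doolittle-style) method that Wikipedia outlines, again assuming no pivoting is needed (`doolittle` is just my name for it):

```python
import numpy as np

def doolittle(A):
    """Doolittle-style LU: unit diagonal on L, no pivoting
    (assumes nonzero pivots)."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    L, U = np.eye(n), np.zeros((n, n))
    for i in range(n):
        for j in range(i, n):        # row i of U
            U[i, j] = A[i, j] - L[i, :i] @ U[:i, j]
        for j in range(i + 1, n):    # column i of L
            L[j, i] = (A[j, i] - L[j, :i] @ U[:i, i]) / U[i, i]
    return L, U

A = [[2, -1, 2], [-6, 0, -2], [8, -1, 5]]
L, U = doolittle(A)
print(np.allclose(L @ U, np.array(A)))   # True: same L and U as in the OP
```

It produces exactly the $L$ and $U$ from the OP, so the two methods really only differ in how the arithmetic is organized.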
 