Solving Ax=b; when to use LU decomposition?

  • Context: Undergrad
  • Thread starter: hkBattousai
  • Tags: Decomposition

Discussion Overview

The discussion revolves around methods for solving the linear equation Ax = b, specifically comparing the use of matrix inversion and LU decomposition. Participants explore the efficiency and robustness of these methods, particularly in the context of programming implementations and numerical stability.

Discussion Character

  • Technical explanation
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • One participant suggests using LU decomposition as a method to solve Ax = b, noting that it may be faster than calculating the inverse of A.
  • Another participant argues that LU decomposition is generally faster for matrices larger than 1x1, while cautioning that special cases may exist for small matrices.
  • A participant mentions that calculating the inverse using LU decomposition involves solving multiple sets of equations, which may be more work than directly solving Ax = b.
  • Concerns are raised about the robustness of algorithms, particularly regarding rounding errors when matrices are close to singular.
  • Several participants clarify that to find the inverse of A, the b vector should be the identity matrix, with one participant providing examples of the b vectors for a 4x4 matrix.
  • Another participant confirms the understanding that substituting columns of the identity matrix for the b vector allows for the calculation of the inverse of A by merging the resulting x vectors.
  • A final comment emphasizes that while LU decomposition is generally beneficial, computing the inverse of a matrix should be avoided for reasons of speed and stability.

Areas of Agreement / Disagreement

Participants generally agree on the advantages of LU decomposition over matrix inversion, particularly for larger matrices. However, there remains some uncertainty about the best approach for smaller matrices and the implications of numerical stability.

Contextual Notes

Participants express varying levels of understanding regarding the method for calculating the inverse of A, indicating potential gaps in knowledge about the application of the identity matrix as the b vector. Additionally, concerns about rounding errors highlight the need for careful consideration of numerical methods.

Who May Find This Useful

Readers interested in numerical methods for solving linear equations, particularly in programming contexts, may find this discussion relevant.

hkBattousai
I want to solve the linear equation below:

Ax = b


For this purpose, I'm writing C++ code. I have written routines both for decomposing the A matrix into L and U matrices and for calculating the inverse of A.

I may multiply both sides by A^{-1}:

Ax = b
A^{-1}Ax = A^{-1}b
x = A^{-1}b

Or, I can use LU decomposition:

Ax = b
A = LU
Ax = LUx = Ly = b, where y = Ux
Solve Ly = b for y
Solve Ux = y for x
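For concreteness, the two triangular solves can be sketched in C++ like this (a minimal sketch, assuming a dense std::vector<std::vector<double>> representation and that the decomposition routine returns L and U with nonzero diagonals; the names are illustrative, not from the thread):

Code:
#include <vector>

using Matrix = std::vector<std::vector<double>>;
using Vector = std::vector<double>;

// Forward substitution: solve L y = b for y, with L lower triangular.
Vector forwardSub(const Matrix& L, const Vector& b) {
    const std::size_t n = b.size();
    Vector y(n);
    for (std::size_t i = 0; i < n; ++i) {
        double s = b[i];
        for (std::size_t j = 0; j < i; ++j)
            s -= L[i][j] * y[j];
        y[i] = s / L[i][i];
    }
    return y;
}

// Back substitution: solve U x = y for x, with U upper triangular.
Vector backSub(const Matrix& U, const Vector& y) {
    const std::size_t n = y.size();
    Vector x(n);
    for (std::size_t i = n; i-- > 0; ) {
        double s = y[i];
        for (std::size_t j = i + 1; j < n; ++j)
            s -= U[i][j] * x[j];
        x[i] = s / U[i][i];
    }
    return x;
}

// Given A = LU, x = backSub(U, forwardSub(L, b)) solves Ax = b.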


The LU decomposition method is said to be faster, but I'm not sure whether that's true in all cases. I have a feeling that the first method (matrix inversion) would be faster for small A matrices.

My question is: how do I decide which method to use?
 
hkBattousai said:
The LU decomposition method is said to be faster, but I'm not sure whether that's true in all cases.

It is faster for any matrix of order greater than 1x1. Well, to be fair, you can write special code for a 2x2 matrix, but the amount of computation is so tiny that it's hardly worth the bother, unless you need to solve 100 million sets of 2x2 equations (and there are numerical applications that DO need to do that sort of thing).

For at least 999 algorithms out of 1000, finding the explicit inverse of a matrix is not the fastest way, though "beginners" are often tempted to just "translate" math equations containing inverse matrices directly into a programming language.

Actually, a fast and reliable way to calculate the inverse of an NxN matrix is to first find the LU decomposition, and then solve N sets of equations where the "b" vectors have one 1 and the other terms all zero, to find the columns of the inverse matrix one at a time. That is already more work than solving one set of equations using LU decomposition.
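As a sketch of that column-by-column procedure (same dense representation as above, no pivoting, and L and U assumed to come from an existing decomposition routine; the function name is mine, not from the thread):

Code:
#include <vector>

using Matrix = std::vector<std::vector<double>>;

// Build A^{-1} column by column from an existing factorization A = LU:
// for each unit vector e_k (1 in row k, zeros elsewhere), solve
// L y = e_k, then U x = y; the resulting x is column k of the inverse.
Matrix inverseFromLU(const Matrix& L, const Matrix& U) {
    const std::size_t n = L.size();
    Matrix inv(n, std::vector<double>(n, 0.0));
    std::vector<double> y(n), x(n);
    for (std::size_t k = 0; k < n; ++k) {
        // Forward substitution with b = e_k.
        for (std::size_t i = 0; i < n; ++i) {
            double s = (i == k) ? 1.0 : 0.0;
            for (std::size_t j = 0; j < i; ++j) s -= L[i][j] * y[j];
            y[i] = s / L[i][i];
        }
        // Back substitution.
        for (std::size_t i = n; i-- > 0; ) {
            double s = y[i];
            for (std::size_t j = i + 1; j < n; ++j) s -= U[i][j] * x[j];
            x[i] = s / U[i][i];
        }
        // x is column k of A^{-1}.
        for (std::size_t i = 0; i < n; ++i) inv[i][k] = x[i];
    }
    return inv;
}

Each column costs one forward and one back substitution, which is why forming the full inverse amounts to n triangular-solve pairs, versus a single pair when you only need to solve Ax = b for one b.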
 
Thank you for your guidance. I will stick to LU decomposition.


AlephZero said:
Actually, a fast and reliable way to calculate the inverse of an NxN matrix is to first find the LU decomposition, and then solve N sets of equations where the "b" vectors have one 1 and the other terms all zero, to find the columns of the inverse matrix one at a time. That is already more work than solving one set of equations using LU decomposition.

This is interesting information.
I didn't understand what b vector to choose for calculating the inverse of A. Can you please give an example b vector and explain the method for finding the inverse of the A matrix?
 
Did you consider how robust the algorithms are?
An important concern is usually to minimize the effect of rounding errors.
This is especially important when the matrix is close to singular.
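The thread does not spell out a remedy, but the standard safeguard inside an LU routine is partial pivoting: at each elimination step, swap in the row with the largest available pivot so the multipliers stay bounded and rounding errors are not amplified. A sketch, under the same dense-storage assumption as above and not taken from the thread:

Code:
#include <cmath>
#include <utility>
#include <vector>

using Matrix = std::vector<std::vector<double>>;

// In-place LU factorization with partial pivoting (Doolittle form):
// on return, A holds U in its upper triangle and the multipliers of a
// unit-diagonal L below the diagonal; perm records the row ordering.
// Returns false if a pivot is exactly zero, i.e. A is singular to
// working precision.
bool luDecomposePivoting(Matrix& A, std::vector<std::size_t>& perm) {
    const std::size_t n = A.size();
    perm.resize(n);
    for (std::size_t i = 0; i < n; ++i) perm[i] = i;

    for (std::size_t k = 0; k < n; ++k) {
        // Pick the row with the largest entry in column k as the pivot row.
        std::size_t p = k;
        for (std::size_t i = k + 1; i < n; ++i)
            if (std::fabs(A[i][k]) > std::fabs(A[p][k])) p = i;
        if (std::fabs(A[p][k]) == 0.0) return false;   // singular
        std::swap(A[k], A[p]);
        std::swap(perm[k], perm[p]);

        // Eliminate below the pivot, storing multipliers in L's slots.
        for (std::size_t i = k + 1; i < n; ++i) {
            A[i][k] /= A[k][k];
            for (std::size_t j = k + 1; j < n; ++j)
                A[i][j] -= A[i][k] * A[k][j];
        }
    }
    return true;
}

// To solve Ax = b afterwards: form b'[i] = b[perm[i]], forward-substitute
// with the unit-diagonal L stored below the diagonal, then back-substitute
// with U.

Pivoting also prevents a breakdown on a zero pivot, which can occur even when A itself is nonsingular.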
 
To calculate the inverse of matrix A (n x n), the b matrix is equal to the n x n Identity Matrix (only 1s on the main diagonal, all zeros elsewhere).
 
SteamKing said:
To calculate the inverse of matrix A (n x n), the b matrix is equal to the n x n Identity Matrix (only 1s on the main diagonal, all zeros elsewhere).
In the Ax=b equation I'm working with, A is a square matrix, x and b are column vectors.

Like this:
[Attached image lineq.png: A (square matrix) times x (column vector) = b (column vector)]


So, how do I make the b vector an nxn identity matrix?
 
AlephZero said:
solve N sets of equations where the "b" vectors have one 1 and the other terms all zero, to find the columns of the inverse matrix one at a time.

hkBattousai said:
I didn't understand what b vector to choose for calculating the inverse of A. Can you please give an example b vector and explain the method for finding the inverse of the A matrix?
AlephZero told you that. For, say, a 4 by 4 matrix, use
\begin{bmatrix}1 \\ 0 \\ 0 \\ 0\end{bmatrix}, \begin{bmatrix}0 \\ 1 \\ 0 \\ 0\end{bmatrix}, \begin{bmatrix}0 \\ 0 \\ 1 \\ 0\end{bmatrix}, and \begin{bmatrix}0 \\ 0 \\ 0 \\ 1\end{bmatrix}
 
Hmm. I will substitute the columns of the identity matrix for the b vector, n times, to find n different x vectors. Next, I will merge these x vectors to form the inverse of the A matrix. Did I understand that right?
 
^Yes, that is right.
The LU decomposition is basically free, so even when it does not help it cannot hurt. Computing the inverse is best avoided if possible, both for speed and stability.
 
