Calculation of the inverse matrix - Number of operations

SUMMARY

The inverse of a regular $n\times n$ matrix can be computed via Gaussian elimination with a total of $n^3 + O(n^2)$ operations, where one operation is defined as a multiplication or division. If the columns of the inverse are instead computed via an LU-decomposition without exploiting the zero structure of the unit vectors, the count rises to $\frac{4}{3}n^3 + O(n^2)$. The discussion also notes that older references often count only multiplications and divisions and neglect additions, which can lead to discrepancies between reported operation counts. Gauss-Jordan elimination yields the same count as Gaussian elimination with back-substitution: exactly $n^3$ multiplications and $n^3 + O(n^2)$ additions.

PREREQUISITES
  • Understanding of matrix operations and properties, specifically regular matrices.
  • Familiarity with Gaussian elimination and LU-decomposition techniques.
  • Knowledge of computational complexity, particularly operation counts in algorithms.
  • Basic concepts of numerical stability in matrix computations.
NEXT STEPS
  • Study the details of Gaussian elimination and its computational complexity.
  • Learn about LU-decomposition and its application in solving linear systems.
  • Explore QR-decomposition with Householder reflections for improved numerical stability.
  • Investigate the historical context of operation counts in matrix algorithms and their implications.
USEFUL FOR

Mathematicians, computer scientists, and engineers working in numerical analysis or algorithm design, and anyone interested in optimizing matrix operations and understanding computational efficiency.

mathmari
Hey! :o

Let $A$ be a regular ($n\times n$)-matrix for which the Gauss algorithm is possible.

If we choose as the right-hand side $b$ the unit vectors $$e^{(1)}=(1, 0, \ldots , 0)^T, \ldots , e^{(n)}=(0, \ldots , 0, 1 )^T$$ and calculate the corresponding solutions $x^{(1)}, \ldots , x^{(n)}$, then the inverse matrix is $A^{-1}=[x^{(1)}, \ldots , x^{(n)}]$.

We can calculate the inverse with $n^3+O(n^2)$ operations (1 operation = 1 multiplication or division).
If we calculate the solutions $x^{(1)}, \ldots , x^{(n)}$ using the LU-decomposition, we get $\frac{4}{3}n^3+O(n^2)$ operations, or not?

That is because we apply the Gauss algorithm, which requires $\frac{1}{3}n^3+O(n^2)$ operations, right?

How do we get $n^3+O(n^2)$?

Do we have to use another algorithm here?
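
A minimal sketch of the column-by-column idea in Python/NumPy (an illustration only; SciPy's `lu_factor`/`lu_solve` are used purely for convenience, and $A$ is assumed regular): factor $A$ once, then solve $Ax=e^{(i)}$ for every unit vector and collect the solutions as columns.

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

# Column-by-column inversion: the solution of A x = e^(i) is the i-th
# column of A^{-1}.
rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))          # assumed regular (invertible)

lu, piv = lu_factor(A)                   # factor A once (partial pivoting)
X = np.column_stack([lu_solve((lu, piv), e) for e in np.eye(n)])

print(np.allclose(A @ X, np.eye(n)))     # X is (numerically) the inverse
```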
 
mathmari said:
Hey! :o

Let $A$ be a regular ($n\times n$)-matrix for which the Gauss algorithm is possible.

If we choose as the right-hand side $b$ the unit vectors $$e^{(1)}=(1, 0, \ldots , 0)^T, \ldots , e^{(n)}=(0, \ldots , 0, 1 )^T$$ and calculate the corresponding solutions $x^{(1)}, \ldots , x^{(n)}$, then the inverse matrix is $A^{-1}=[x^{(1)}, \ldots , x^{(n)}]$.

We can calculate the inverse with $n^3+O(n^2)$ operations (1 operation = 1 multiplication or division).
If we calculate the solutions $x^{(1)}, \ldots , x^{(n)}$ using the LU-decomposition, we get $\frac{4}{3}n^3+O(n^2)$ operations, or not?

Hey mathmari! (Smile)

LU-decomposition is listed here as $\frac 23 n^3 +O(n^2)$, while QR-decomposition with Householder reflections (for numerical stability) is $\frac 43n^3+O(n^2)$. (Nerd)

mathmari said:
That is because we apply the Gauss algorithm, which requires $\frac{1}{3}n^3+O(n^2)$ operations, right?

How do we get $n^3+O(n^2)$?

Do we have to use another algorithm here?

That $\frac 13 n^3+O(n^2)$ is indeed what it takes to get the matrix into row echelon form.
Afterwards we still need to solve the system for each of the $n$ unit vectors, which takes an extra $\frac 12 n^3 + O(n^2)$ if I'm not mistaken. (Thinking)
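
For reference, here is a sketch of how the multiplication/division count comes out to $n^3+O(n^2)$ when the leading zeros of the unit vectors are exploited in the forward solves:

$$\underbrace{\tfrac{1}{3}n^3}_{\text{elimination of }A}\;+\;\underbrace{\tfrac{1}{6}n^3}_{\text{forward solves }Ly=e^{(i)}}\;+\;\underbrace{\tfrac{1}{2}n^3}_{\text{back substitutions }Ux=y}\;+\;O(n^2)\;=\;n^3+O(n^2).$$

Each forward solve $Ly=e^{(i)}$ only involves the trailing $n-i$ components, because $y$ inherits the leading zeros of $e^{(i)}$. Without this saving the forward solves cost $\tfrac12 n^3$ instead of $\tfrac16 n^3$, and the total becomes the $\tfrac43 n^3+O(n^2)$ from the original question.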
 
mathmari said:
We can calculate the inverse with $n^3+O(n^2)$ operations. (1 operation = 1 multiplication or division)

When comparing operation counts for different methods and from different references, it is perhaps useful to note that older references sometimes neglect additions (which include subtractions), because multiplications (which include divisions) used to be the determining factor, as they were much slower. (This may already be known to everyone participating, in which case I apologize for stating the obvious.)

I learned that inversion using Gaussian elimination with back-substitution costs $n^3$ multiplications (exactly) and $n^3 + O(n^2)$ additions. Interestingly, for Gauss-Jordan the count is precisely the same.

(Elimination with back-substitution for one system costs $\frac{n^3}{2} + O(n^2)$ multiplications and $\frac{n^3}{2} + O(n)$ (no typo) additions.)
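
To check these counts empirically, here is a rough Python/NumPy sketch (an illustration only, with no pivoting and hence no claim of numerical stability): LU factorization followed by column-by-column forward and back substitution, counting multiplications and divisions. The counter comes out close to $n^3$.

```python
import numpy as np

def lu_inplace(A):
    """LU factorization without pivoting (illustration only; assumes all
    pivots are nonzero). Returns the number of multiplications/divisions."""
    n = A.shape[0]
    ops = 0
    for k in range(n - 1):
        A[k+1:, k] /= A[k, k]                          # multipliers: n-k-1 divisions
        ops += n - k - 1
        A[k+1:, k+1:] -= np.outer(A[k+1:, k], A[k, k+1:])
        ops += (n - k - 1) ** 2                        # one mult per updated entry
    return ops

def invert_via_lu(A):
    """Invert A column by column: A x = e_i  <=>  L y = e_i, then U x = y."""
    n = A.shape[0]
    LU = A.astype(float).copy()
    ops = lu_inplace(LU)                               # ~ n^3/3
    X = np.zeros((n, n))
    for i in range(n):
        # Forward substitution L y = e_i.  Since e_i starts with i zeros,
        # so does y, and the loop can start at row i+1 (~ n^3/6 in total).
        y = np.zeros(n)
        y[i] = 1.0
        for r in range(i + 1, n):
            y[r] = -LU[r, i:r] @ y[i:r]
            ops += r - i
        # Back substitution U x = y (~ n^2/2 per column, ~ n^3/2 in total).
        x = np.zeros(n)
        for r in range(n - 1, -1, -1):
            x[r] = (y[r] - LU[r, r+1:] @ x[r+1:]) / LU[r, r]
            ops += n - r
        X[:, i] = x
    return X, ops

rng = np.random.default_rng(0)
n = 60
A = rng.standard_normal((n, n))
X, ops = invert_via_lu(A)
print(np.abs(A @ X - np.eye(n)).max())                 # residual: should be small
print(ops / n**3)                                      # close to 1
```

If the forward loop is started at row $0$ instead of row $i+1$ (i.e. the zeros of $e^{(i)}$ are not exploited), the count climbs toward $\tfrac43 n^3$.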
 
