Proving Triangular Matrix Inverse is Also Triangular

crd
It is stated in almost every linear algebra text I could find that the inverse of a triangular matrix is also triangular, but no proof accompanies such statements.

I am convinced that it is true, but the only argument I have been able to write down that satisfies me relies on the row operations that transform (A|I) into (I|A^{-1}).

Since this requires only the forward pass (if A is lower triangular) or the backward pass (if A is upper triangular), and those operations never introduce nonzero entries above/below the diagonal (depending on which kind A was), A^{-1} must be a triangular matrix of the same flavor.
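The row-operation argument can be checked numerically. This is a hypothetical sketch (the random matrix, seed, and diagonal shift are my own choices, not from the thread): it row-reduces (A|I) for a lower triangular A using only the forward pass and confirms the right half comes out lower triangular.

```python
import numpy as np

# Build a random invertible lower triangular matrix (illustrative choice).
rng = np.random.default_rng(0)
n = 5
A = np.tril(rng.standard_normal((n, n)))
np.fill_diagonal(A, 1.0 + np.abs(A.diagonal()))  # keep the diagonal nonzero

# Row-reduce the augmented matrix (A|I): only the forward pass is needed,
# since A has no entries above the diagonal to clear.
aug = np.hstack([A, np.eye(n)])
for j in range(n):
    aug[j] /= aug[j, j]               # scale the pivot row
    for i in range(j + 1, n):         # eliminate entries below the pivot
        aug[i] -= aug[i, j] * aug[j]

A_inv = aug[:, n:]                    # right half is now A^{-1}
assert np.allclose(A @ A_inv, np.eye(n))
assert np.allclose(A_inv, np.tril(A_inv))  # no nonzeros above the diagonal
```

Each forward-pass operation only adds a multiple of an earlier (lower triangular) row to a later one, so the right half never acquires an entry above the diagonal.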

Has anyone come across anything a little more elegant than simply brute forcing it?
 
Assume you have an invertible upper triangular matrix A. Consider

AA^{-1} = I

You can use induction, starting from the last row of A times the last column of A^{-1}. Since the last row of A is (0, ..., 0, a_{nn}), that product reduces to a_{nn}(A^{-1})_{nn} = 1, which gives you the lower right entry of A^{-1}.

Then again take the last row of A and column n-1 of A^{-1}. To be able to get the zero in the (n, n-1) position of the identity matrix, the (n, n-1) entry of A^{-1} must be zero.
...
Carry on to the upper left corner and you are done.
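The inductive argument above is equivalent to solving A x_k = e_k column by column by back substitution. A minimal sketch, assuming a random upper triangular A of my own construction (not from the thread): working from the bottom row up, the rows below row k force the corresponding entries of x_k to zero, so each column of A^{-1} has zeros below the diagonal.

```python
import numpy as np

# Random invertible upper triangular matrix (illustrative choice).
rng = np.random.default_rng(1)
n = 5
A = np.triu(rng.standard_normal((n, n)))
np.fill_diagonal(A, 1.0 + np.abs(A.diagonal()))  # keep the diagonal nonzero

# Column k of A^{-1} solves A x = e_k; back-substitute from the bottom row.
A_inv = np.zeros((n, n))
for k in range(n):
    e = np.zeros(n)
    e[k] = 1.0
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        # Row i of AA^{-1} = I: a_ii x_i + sum_{j>i} a_ij x_j = e_i.
        x[i] = (e[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    A_inv[:, k] = x

assert np.allclose(A @ A_inv, np.eye(n))
assert np.allclose(A_inv, np.triu(A_inv))  # zeros below the diagonal
```

For i > k the right-hand side e_i is zero and every x_j with j > i is already zero, so x_i = 0, exactly matching the induction step in the post.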
 
