How can I convince myself that I can find the inverse of this matrix?

Discussion Overview

The discussion revolves around the conditions under which an upper triangular matrix can be inverted, specifically focusing on the matrix's structure and properties such as its kernel and determinant. Participants explore various methods to ascertain the invertibility of the matrix without directly computing its inverse, including Gauss-Jordan elimination and properties of linear independence.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • One participant asks how to verify, without actually carrying out Gauss-Jordan elimination, that an upper triangular matrix can be reduced to the identity matrix.
  • Another participant suggests showing that the matrix has a trivial kernel, implying that the only solution to the equation Ux = 0 is the zero vector.
  • It is noted that if all diagonal entries of the matrix are non-zero, the columns are linearly independent, suggesting invertibility.
  • Some participants propose demonstrating that the determinant of the matrix is non-zero by induction on n, leading to the conclusion that the matrix is invertible.
  • There is a discussion about proving that if Ux = 0, then all entries of x must be zero, thereby confirming the kernel is trivial.
  • One participant mentions the LU decomposition as a method to find a specific inverse of the matrix.
  • Another participant raises a question about the difference between row and column vectors, leading to a clarification on their mathematical representation.
  • There is a suggestion to construct an upper triangular matrix V such that VU equals the identity matrix, outlining a method to find the entries of V.
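The last point above can be sketched numerically. The routine below is an illustrative back-substitution construction (mine, not from the thread) that solves ##Uv_j = e_j## for each column of ##V##, assuming all diagonal entries of ##U## are nonzero; the resulting ##V## is itself upper triangular and satisfies ##VU = I##:

```python
import numpy as np

def upper_triangular_inverse(U):
    """Invert an upper triangular matrix with nonzero diagonal by
    solving U v_j = e_j for each standard basis vector e_j."""
    n = U.shape[0]
    V = np.zeros((n, n))
    for j in range(n):
        e = np.zeros(n)
        e[j] = 1.0
        # Back substitution: solve for entries from the bottom row up.
        for i in range(n - 1, -1, -1):
            V[i, j] = (e[i] - U[i, i + 1:] @ V[i + 1:, j]) / U[i, i]
    return V

U = np.array([[2.0, 1.0, 3.0],
              [0.0, 4.0, 5.0],
              [0.0, 0.0, 6.0]])
V = upper_triangular_inverse(U)
print(np.allclose(V @ U, np.eye(3)))  # True
```

The same back-substitution argument, applied to ##U\mathbf{x} = \mathbf{0}##, forces each entry of ##\mathbf{x}## to be zero from the bottom up, which is the trivial-kernel argument mentioned above.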

Areas of Agreement / Disagreement

Participants express various viewpoints on how to approach proving the invertibility of the matrix, with no consensus reached on a single method. Some agree on the importance of the determinant and kernel, while others explore different approaches without resolving the overall question.

Contextual Notes

The discussion includes assumptions about the non-zero entries on the diagonal of the matrix and their implications for linear independence and invertibility. There are also references to mathematical induction and properties of determinants that remain unproven within the thread.

  • #31
Mark44 said:
The first is a row vector. The second is a column vector, which is the transpose of the row vector. IOW, the column vector is ##x^T##, using the usual notation.
swampwiz said:
I thought it was the other way around.
Not sure what you're considering to be the other way around. My reply to @Hall, which is quoted above, was a response to his asking what is the difference between ##[x_1, x_2, \dots, x_n]## and ##\begin{bmatrix}x_1 \\ x_2 \\ \dots \\ x_n \end{bmatrix}##.
Rows are horizontal and columns (like the columns of a building) are vertical, so the first vector above is a row vector, and the second vector is a column vector.

If your confusion is with the notation ##x^T##, a transpose can be either a row vector or a column vector, depending on how ##x## is originally defined.
 
  • #32
Mark44 said:
Not sure what you're considering to be the other way around. My reply to @Hall, which is quoted above, was a response to his asking what is the difference between ##[x_1, x_2, \dots, x_n]## and ##\begin{bmatrix}x_1 \\ x_2 \\ \dots \\ x_n \end{bmatrix}##.
Rows are horizontal and columns (like the columns of a building) are vertical, so the first vector above is a row vector, and the second vector is a column vector.

If your confusion is with the notation ##x^T##, a transpose can be either a row vector or a column vector, depending on how ##x## is originally defined.
What I was saying was that I thought the nominal form of a vector is as a column, not as a row. Certainly, if written as Ax, x is a column vector.
 
  • #33
swampwiz said:
What I was saying was that I thought the nominal form of a vector is as a column, not as a row. Certainly, if written as Ax, x is a column vector.
I don't think there is a nominal form of a vector. However, in the context of the expression Ax, with A being a matrix, x would have to be a column vector.
 
  • #34
More generally, given an ##n \times n## matrix, the vector must be ##n \times 1## for the product to be defined. So, yes, a column vector.
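The shape requirement can be checked directly; this is a small numpy sketch (mine, not from the thread) showing that the product is defined only for a column vector:

```python
import numpy as np

A = np.arange(9.0).reshape(3, 3)          # a 3x3 matrix
x_col = np.array([[1.0], [2.0], [3.0]])   # 3x1 column vector
x_row = x_col.T                           # 1x3 row vector

print((A @ x_col).shape)  # (3, 1): defined, result is a column vector
# A @ x_row raises ValueError: a 1x3 operand is not aligned with 3x3
```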
 
  • #35
swampwiz said:
What about the eigenproblem equation? A x = 0, but the eigenvectors are solutions of x that are NOT 0; of course, this is possible because the determinant of A must be 0 for this to work.
The eigenproblem equation is ##A\mathbf{x} = \lambda \mathbf{x}##, which leads to ##(A - \lambda I)\mathbf{x} = \mathbf{0}##, and this has nonzero solutions only when ##\det(A - \lambda I) = 0##. The eigenvectors for a given ##\lambda## are the nonzero vectors in the nullspace of ##A - \lambda I##.
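A small numerical check of that relationship (my example, not from the thread), using a symmetric ##2 \times 2## matrix: for each eigenvalue ##\lambda##, the matrix ##A - \lambda I## is singular, and the corresponding eigenvector lies in its nullspace.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
vals, vecs = np.linalg.eig(A)  # eigenvalues 3 and 1

for lam, v in zip(vals, vecs.T):
    # det(A - lam*I) = 0, so A - lam*I is singular ...
    print(np.isclose(np.linalg.det(A - lam * np.eye(2)), 0.0))  # True
    # ... and the eigenvector v lies in the nullspace of A - lam*I.
    print(np.allclose((A - lam * np.eye(2)) @ v, 0.0))          # True
```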
 
