Are the Rows of an Invertible Matrix Linearly Independent?

In summary, the invertible matrix theorem states that the columns of an invertible matrix form a linearly independent set, and the same holds for its rows because [tex]\det(A) = \det(A^{T})[/tex]. Note, however, that the row space and the column space are different spaces, though for an n x n matrix there is an isomorphism between them.
  • #1
jeff1evesque
The invertible matrix theorem states that the columns of the given matrix form a linearly independent set. Can we argue that the rows of the same matrix also form a linearly independent set? If a matrix A is invertible, then its inverse [tex]A^{-1}[/tex] is also invertible, so the columns of [tex]A^{-1}[/tex] are linearly independent, which is equivalent to the rows of A being linearly independent. So can we say that both the rows and columns of A form a linearly independent set?

Thanks,

JL
 
  • #2
Yes. Keep in mind that [tex]\det(A) = \det(A^{T})[/tex], hence every portion of the invertible matrix theorem automatically applies to the rows as well as the columns. You should also come to see that the row space, column space, and null space of a matrix are related: the row rank always equals the column rank, and the rank-nullity theorem ties that common rank to the dimension of the null space.

Note that the space spanned by the rows is different from the space spanned by the columns, since row vectors live in a different vector space than the column vectors. (You will find that there is an isomorphism between the two spaces if the matrix is n x n.)
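As a quick sanity check of the [tex]\det(A) = \det(A^{T})[/tex] fact, here is a minimal pure-Python sketch (my own illustration, not from the thread; the `det` and `transpose` helpers are written for this example):

```python
# Verify det(A) = det(A^T) for a small matrix using cofactor expansion.

def det(M):
    """Determinant by cofactor expansion along the first row."""
    n = len(M)
    if n == 1:
        return M[0][0]
    total = 0
    for j in range(n):
        minor = [row[:j] + row[j + 1:] for row in M[1:]]
        total += (-1) ** j * M[0][j] * det(minor)
    return total

def transpose(M):
    return [list(col) for col in zip(*M)]

A = [[2, 1, 0],
     [1, 3, 1],
     [0, 1, 2]]

print(det(A))             # 8 -- nonzero, so A is invertible
print(det(transpose(A)))  # 8 -- same value for the transpose
```

Since the determinant of A equals that of its transpose, every determinant-based criterion applies to the rows exactly as it does to the columns.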
 

1. What is the Invertible Matrix Theorem?

The Invertible Matrix Theorem is a theorem in linear algebra that collects many equivalent characterizations of when a square matrix is invertible. Among them: its determinant is non-zero, its columns (and rows) are linearly independent, and it row-reduces to the identity matrix.

2. What does it mean for a matrix to be invertible?

A square matrix is invertible if it has an inverse matrix, which, when multiplied by the original matrix on either side, gives the identity matrix. In this sense the inverse matrix can "undo" the original matrix.
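The "undoing" can be shown concretely. Below is a small sketch (an assumed example, not from the thread) using the standard 2 x 2 inverse formula; a matrix with determinant 1 is chosen so the arithmetic stays exact:

```python
# For an invertible 2x2 matrix A with determinant d,
# A^{-1} = (1/d) * [[a22, -a12], [-a21, a11]], and A * A^{-1} = I.
A = [[2, 1],
     [1, 1]]
d = A[0][0] * A[1][1] - A[0][1] * A[1][0]   # det = 2*1 - 1*1 = 1

A_inv = [[ A[1][1] / d, -A[0][1] / d],
         [-A[1][0] / d,  A[0][0] / d]]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

print(matmul(A, A_inv))   # [[1.0, 0.0], [0.0, 1.0]] -- the identity matrix
```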

3. How do you determine if a matrix is invertible?

To determine if a matrix is invertible, you can calculate its determinant. If the determinant is non-zero, then the matrix is invertible. Alternatively, you can perform row reduction on the matrix and check if it reduces to the identity matrix.
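The row-reduction test can be sketched in a few lines (a minimal illustration under my own naming, not a library routine): Gaussian elimination with partial pivoting succeeds for every column exactly when the matrix reduces to the identity.

```python
# A square matrix is invertible iff elimination finds a nonzero pivot
# in every column (equivalently, it row-reduces to the identity).
def is_invertible(M, tol=1e-12):
    A = [row[:] for row in M]          # work on a copy
    n = len(A)
    for col in range(n):
        # pick the row with the largest entry in this column as the pivot
        pivot = max(range(col, n), key=lambda r: abs(A[r][col]))
        if abs(A[pivot][col]) < tol:
            return False               # no usable pivot -> singular
        A[col], A[pivot] = A[pivot], A[col]
        for r in range(col + 1, n):
            factor = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= factor * A[col][c]
    return True

print(is_invertible([[1, 2], [3, 4]]))   # True  (det = -2)
print(is_invertible([[1, 2], [2, 4]]))   # False (rows are dependent)
```

The second matrix fails because its second row is twice the first, which is exactly the row-independence condition discussed above.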

4. What is the significance of the Invertible Matrix Theorem?

The Invertible Matrix Theorem is significant because it provides a simple and efficient way to determine if a matrix is invertible. It also allows us to solve systems of linear equations using matrix operations.

5. Are there any exceptions to the Invertible Matrix Theorem?

No. For square matrices the theorem has no exceptions: a square matrix with determinant zero is never invertible, regardless of whether it has non-zero rows or columns. Non-square matrices fall outside the theorem entirely; they have no two-sided inverse and are therefore never called invertible.
