Rank of Matrices: Why Equal to Transpose?

  • Thread starter: Fu Lin
  • Tags: Matrices, rank
Summary
The discussion centers on understanding why the rank of a matrix is equal to the rank of its transpose. It highlights that the number of linearly independent rows matches the number of linearly independent columns, which is a fundamental property of matrices. The conversation suggests that a proof can be constructed using basic principles of matrix multiplication and properties of elementary matrices. Key concepts include the effects of multiplying by invertible matrices and the relationship between transposes and products. Overall, the discussion emphasizes the need for a clear, elementary explanation of this mathematical property.
Fu Lin
The question, in short, is: why is the rank of a matrix equal to the rank of its transpose?

A matrix is an array of numbers. It amazes me that the number of linearly independent rows coincides with the number of linearly independent columns. I tried to find a fundamental answer to this question, one that does not resort to concepts like singular values or eigenvalues, so that it can be explained to someone who does not know linear algebra. Is there an elementary way to explain this fact?
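
(For what it's worth, the coincidence is easy to check numerically. Below is a quick sketch using NumPy's matrix_rank with an arbitrarily chosen random matrix; it only illustrates the claim, it does not explain it.)

```python
import numpy as np

# An arbitrary 4x7 matrix: its rows live in R^7 and its columns in R^4,
# yet the number of independent rows equals the number of independent columns.
rng = np.random.default_rng(0)
A = rng.integers(-5, 5, size=(4, 7)).astype(float)

print(np.linalg.matrix_rank(A))    # row/column rank of A
print(np.linalg.matrix_rank(A.T))  # the same value for the transpose
```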
 
Fu Lin said:
I tried to find a fundamental answer to this question

In mathematics, such "fundamental answers" are called proofs. Start searching for one. :wink:
 
Fu Lin said:
The question, in short, is: why is the rank of a matrix equal to the rank of its transpose?

A matrix is an array of numbers.
No! A matrix is an object in an algebraic system with specific properties. It can be represented by an "array of numbers".

It amazes me that the number of linearly independent rows coincides with the number of linearly independent columns. I tried to find a fundamental answer to this question, one that does not resort to concepts like singular values or eigenvalues, so that it can be explained to someone who does not know linear algebra. Is there an elementary way to explain this fact?
It depends entirely upon the definition of matrix multiplication: each entry in the product of two matrices is the "dot product" of a row of the first matrix with a column of the second matrix. Reversing the order of the product interchanges the roles of rows and columns.
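
Written out entrywise, with A an m×n matrix and B an n×p matrix, this is the standard computation behind the rule $(AB)^T = B^T A^T$:

$$\left[(AB)^T\right]_{ij} = (AB)_{ji} = \sum_{k} A_{jk}\,B_{ki} = \sum_{k} \left[B^T\right]_{ik}\left[A^T\right]_{kj} = \left[B^T A^T\right]_{ij}.$$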
 
mathboy
You need several preliminary results to prove that the rank of a matrix is equal to the rank of its transpose:
1) If A has rank r, then you can left- and right-multiply it by elementary matrices so that the r×r submatrix in the upper left corner of the new matrix is the identity and all other entries are zero.
2) Multiplying a matrix by invertible matrices does not change its rank.
3) Elementary matrices are invertible.
4) The transpose of a product is the product of the transposes (in reverse order).
5) The inverse of a transpose is the transpose of the inverse.

Now use these results to prove the theorem.
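
(For reference, here is a sketch of one way these facts combine: by (1) and (3), A = PDQ, where P and Q are invertible, being inverses of products of elementary matrices, and D has the r×r identity in its upper left corner and zeros elsewhere; by (2), rank(A) = rank(D) = r. Transposing and using (4), A^T = Q^T D^T P^T. By (5), Q^T and P^T are invertible, and D^T again has the r×r identity in its upper left corner, so by (2), rank(A^T) = rank(D^T) = r = rank(A).)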
 
mathboy said:
You need several preliminary results to prove that the rank of a matrix is equal to the rank of its transpose:
1) If A has rank r, then you can left- and right-multiply it by elementary matrices so that the r×r submatrix in the upper left corner of the new matrix is the identity and all other entries are zero.
2) Multiplying a matrix by invertible matrices does not change its rank.
3) Elementary matrices are invertible.
4) The transpose of a product is the product of the transposes (in reverse order).
5) The inverse of a transpose is the transpose of the inverse.

Now use these results to prove the theorem.

Thank you. :smile:
 
