Matrix with Orthonormal Columns

In summary, the discussion concerns the properties of QQ^T, where Q is a rectangular m x n matrix (m > n) with orthonormal columns. While Q^TQ is the n x n identity, QQ^T is a symmetric m x m matrix that turns out to be singular, and the question is why. The discussion establishes that QQ^T has rank n, where n is the number of columns of Q, so it is singular whenever m > n. It is also noted that, for any matrix A, rank(A) = rank(A^TA) = rank(AA^T), which makes intuitive sense because no new information is added when forming these matrix products.
  • #1
hotvette
Homework Helper
If Q is an m x n (m > n) matrix with orthonormal columns, we know that [itex]Q^TQ = I[/itex] of dimension n x n. I have a question about [itex]QQ^T[/itex]. It is a symmetric m x m matrix but also appears to be singular. Why would it be singular? Probably something basic I've long since forgotten.
 
  • #2
hotvette said:
If Q is an m x n (m > n) matrix with orthonormal columns, we know that [itex]Q^TQ = I[/itex] of dimension n x n. I have a question about [itex]QQ^T[/itex]. It is a symmetric m x m matrix but also appears to be singular. Why would it be singular? Probably something basic I've long since forgotten.
Why do you say it "appears to be singular"? You can take Q = I as a matrix with orthonormal columns such that [itex]QQ^T[/itex] is not singular.
 
  • #3
Thanks for your reply. I'm dealing specifically with non-square matrices with more rows than columns where I'm trying to factor [itex]QQ^T[/itex] using QR or SVD. The result is always nonsense so I tried an experiment by generating random non-square matrices with orthonormal columns and in all cases [itex]QQ^T[/itex] had condition numbers ~1E15 and the factorization routines broke down. I figured there must be some rule or axiom I'd forgotten.
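
For concreteness, here is a minimal NumPy sketch of that experiment (the sizes m and n are arbitrary, chosen just for illustration): build a random m x n matrix, orthonormalize its columns with a reduced QR, and look at [itex]QQ^T[/itex].

[code]
import numpy as np

m, n = 8, 3
rng = np.random.default_rng(0)
A = rng.standard_normal((m, n))
Q, _ = np.linalg.qr(A)                      # m x n with orthonormal columns (reduced QR)

print(np.allclose(Q.T @ Q, np.eye(n)))      # True: Q^T Q = I (n x n)
print(np.linalg.matrix_rank(Q @ Q.T))       # n (= 3): the m x m product is rank deficient
print(np.linalg.cond(Q @ Q.T))              # huge (~1e16), i.e. numerically singular
[/code]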

This came up when trying to simplify the matrix equation [itex]A^TDAx = A^Tb[/itex] where A is non-square with more rows than columns and D is diagonal. If A is factored into QR the simplification gets stalled because I end up with [itex]QQ^T[/itex] on both sides and can't go further.

The broader question is whether [itex]A^TDAx = A^Tb[/itex] can be solved for x without having to carry out the matrix multiplications [itex]A^TDA[/itex], where A is non-square with more rows than columns and D is diagonal.
 
  • #4
I managed to answer the broader question after realizing I made a goof in the equation. It really is [itex]A^TDAx = A^TDb[/itex] and can be readily solved by letting [itex]B = D^{1/2}A[/itex]. The problem is then [itex]B^TBx = B^TD^{1/2}b[/itex], which is easy to solve.
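
A minimal NumPy sketch of that substitution (sizes and data are arbitrary, just for illustration): scaling the rows of A by the square roots of the weights turns the weighted normal equations into an ordinary least-squares problem, so [itex]A^TDA[/itex] never has to be formed explicitly.

[code]
import numpy as np

m, n = 10, 4
rng = np.random.default_rng(1)
A = rng.standard_normal((m, n))
d = rng.uniform(0.5, 2.0, size=m)           # diagonal of D (positive weights)
b = rng.standard_normal(m)

B = np.sqrt(d)[:, None] * A                 # B = D^(1/2) A, scaling the rows of A
rhs = np.sqrt(d) * b                        # D^(1/2) b

# minimizing ||B x - D^(1/2) b|| solves B^T B x = B^T D^(1/2) b
x, *_ = np.linalg.lstsq(B, rhs, rcond=None)

# check against the explicit normal equations A^T D A x = A^T D b
x_check = np.linalg.solve(A.T @ (d[:, None] * A), A.T @ (d * b))
print(np.allclose(x, x_check))              # True
[/code]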

On the other question, I'm still curious whether a generalization can be made about the nature of [itex]QQ^T[/itex] in the case where Q is rectangular (m > n) and has orthonormal columns. I've scoured my linear algebra texts and can't find any statements about this.
 
  • #5
hotvette said:
If Q is an m x n (m > n) matrix with orthonormal columns, we know that [itex]Q^TQ = I[/itex] of dimension n x n. I have a question about [itex]QQ^T[/itex]. It is a symmetric m x m matrix but also appears to be singular. Why would it be singular? Probably something basic I've long since forgotten.

[itex]QQ^T[/itex] has rank n (its rank equals the rank of Q), so if n < m then it is singular.
 
  • #6
Thanks. Actually, I realized it doesn't matter that the columns are orthonormal. I found the relation I was looking for:

rank(A) = rank(A^TA) = rank(AA^T)

It makes intuitive sense since you aren't adding any information by forming the matrix products.

http://en.wikipedia.org/wiki/Rank_(linear_algebra)
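
A quick numerical check of that rank identity for a rectangular A (sizes arbitrary, just for illustration):

[code]
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((7, 3))             # rectangular, full column rank

print(np.linalg.matrix_rank(A))             # 3
print(np.linalg.matrix_rank(A.T @ A))       # 3  (3 x 3, nonsingular)
print(np.linalg.matrix_rank(A @ A.T))       # 3  (7 x 7, singular)
[/code]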
 

What is a matrix with orthonormal columns?

A matrix with orthonormal columns is a matrix (not necessarily square) in which each column is a unit vector and all columns are mutually orthogonal (perpendicular) to each other. This means that the dot product of any two distinct columns is equal to 0 and the magnitude of each column is equal to 1; equivalently, Q^TQ = I.

What are the properties of a matrix with orthonormal columns?

A matrix with orthonormal columns has several important properties, including:

  • All columns are perpendicular to each other.
  • The dot product of any two distinct columns is equal to 0.
  • The magnitude of each column is equal to 1.
  • Q^TQ is the n x n identity matrix.
  • If the matrix is square, it is invertible and its transpose equals its inverse; if it has more rows than columns (m > n), then QQ^T has rank n and is singular.

How is a matrix with orthonormal columns used in mathematics?

A matrix with orthonormal columns is commonly used in mathematics for various purposes, such as:

  • Orthogonal projection and least squares approximation (see the sketch after this list).
  • Eigenvalue decomposition.
  • Rotation and reflection transformations.
  • Computing the inverse and determinant of a matrix.
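
As a sketch of the first use above: for Q with orthonormal columns, QQ^T is the orthogonal projector onto the column space of Q, which also explains why it is singular when m > n (it is a projector of rank n). A minimal NumPy illustration (sizes arbitrary):

[code]
import numpy as np

rng = np.random.default_rng(3)
Q, _ = np.linalg.qr(rng.standard_normal((6, 2)))   # 6 x 2, orthonormal columns
P = Q @ Q.T                                        # orthogonal projector onto range(Q)

print(np.allclose(P @ P, P))            # idempotent: P is a projector
print(np.allclose(P @ Q, Q))            # leaves the column space of Q fixed
print(np.linalg.matrix_rank(P))         # 2 = n, so P is singular for m > n
[/code]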

What is the difference between a matrix with orthonormal columns and an orthogonal matrix?

A matrix with orthonormal columns and an orthogonal matrix are closely related, but they are not exactly the same. An orthogonal matrix is a square matrix whose columns (and therefore also its rows) are orthonormal, so its transpose equals its inverse. A matrix with orthonormal columns may be rectangular, with more rows than columns; in that case Q^TQ = I still holds, but QQ^T is not the identity and Q has no inverse. Every orthogonal matrix has orthonormal columns, but a rectangular matrix with orthonormal columns is not an orthogonal matrix.

How do you check if a matrix has orthonormal columns?

To check if a matrix has orthonormal columns, you can perform the following steps:

  1. Calculate the dot product of each pair of columns. If all dot products are equal to 0, then the columns are orthogonal.
  2. Calculate the magnitude of each column. If all magnitudes are equal to 1, then the columns are normalized.
  3. If both conditions are satisfied, then the matrix has orthonormal columns.
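
Both steps amount to checking that Q^TQ is the identity matrix. A small NumPy helper (the function name is just illustrative) that performs this check to within a tolerance:

[code]
import numpy as np

def has_orthonormal_columns(Q, tol=1e-10):
    """Return True if the columns of Q are orthonormal, i.e. Q^T Q = I."""
    Q = np.asarray(Q, dtype=float)
    return np.allclose(Q.T @ Q, np.eye(Q.shape[1]), atol=tol)

# example usage
Q, _ = np.linalg.qr(np.random.default_rng(4).standard_normal((5, 2)))
print(has_orthonormal_columns(Q))                  # True
print(has_orthonormal_columns(np.ones((5, 2))))    # False
[/code]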
