Transforming a matrix into an orthogonal one

  • Context: Graduate
  • Thread starter: onako
  • Tags: Matrix, Orthogonal
SUMMARY

The discussion confirms that for a matrix X of size n x p with p linearly independent columns, there exists a p x p matrix A that transforms X into a matrix Y with orthonormal columns, satisfying Y^TY=I. The Gram-Schmidt orthogonalization process can be used to construct the orthonormal basis. Explicitly, A can be calculated as the inverse of the square root of X^*X, where X^* is the Hermitian conjugate of X; this A carries the original columns to orthonormal ones.

PREREQUISITES
  • Understanding of linear independence in matrices
  • Familiarity with Gram-Schmidt orthogonalization
  • Knowledge of Hermitian matrices and their properties
  • Basic concepts of matrix diagonalization
NEXT STEPS
  • Study the Gram-Schmidt orthogonalization process in detail
  • Learn about the properties of Hermitian matrices and their applications
  • Explore matrix diagonalization techniques and their significance
  • Investigate the computation of matrix inverses and square roots in linear algebra
USEFUL FOR

Mathematicians, data scientists, and engineers involved in linear algebra, particularly those working with matrix transformations and orthogonalization techniques.

onako
Suppose a matrix X of size n x p is given, n>p, with p linearly independent columns. Can it be guaranteed that there exists a matrix A of size p x p that converts the columns of X to orthonormal columns? In other words, is there an A such that Y=XA and Y^TY=I, where I is the p x p identity matrix?
 
Yes. Since the columns of X are independent, they form a basis for R^n. We can then use the "Gram-Schmidt orthogonalization" process to construct an orthonormal basis from them. A will be the "change of basis" matrix that changes the representation of a vector in the original basis to its representation in the orthonormal basis.
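
A minimal numerical sketch of this idea (my own illustration, not from the original post): NumPy's reduced QR factorization performs an orthogonalization equivalent to Gram-Schmidt, and A = R^{-1} is then the change-of-basis matrix.

import numpy as np

# Orthonormalize the columns of X via QR (a numerically stable form of
# Gram-Schmidt): X = QR with Q^T Q = I and R invertible (upper triangular)
# when the columns of X are independent, so A = R^{-1} gives Y = XA = Q.
rng = np.random.default_rng(0)
n, p = 6, 3
X = rng.standard_normal((n, p))           # columns independent almost surely

Q, R = np.linalg.qr(X)                    # reduced QR: Q is n x p, R is p x p
A = np.linalg.inv(R)                      # p x p change-of-basis matrix
Y = X @ A                                 # equals Q up to rounding error

print(np.allclose(Y.T @ Y, np.eye(p)))    # True: Y^T Y = I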
 
Thanks. Just one note: I suppose you've taken into account that there are p columns in X (which is an n x p matrix). If I'm not wrong, only n linearly independent vectors in R^n define a basis of R^n.

So, given an input X with linearly independent columns, such columns could be transformed by GS processing to yield Y, such that Y^TY=I, and there exists an A such that Y=XA. How could one calculate such an A a priori?
 
Yes Onako, it is true, but for a different reason than the one stated by HallsofIvy.

If the columns of X are linearly independent, X^*X is an invertible p x p matrix. Here X^* is the Hermitian conjugate of X, i.e. the conjugate transpose of X; if X is real then X^*=X^T. (I use X^* only because what I say works for complex matrices as well.)

The matrix X^*X is positive semidefinite for any X, and since X^*X is invertible, it is in fact positive definite (all eigenvalues are positive). Since X^*X is Hermitian (symmetric if X is real), it can be diagonalized, i.e. it can be represented as a diagonal matrix in some orthonormal basis, or equivalently, it can be written as X^*X = U^* D U, where U is a unitary matrix (U^{-1}=U^*) and D is a diagonal matrix with the eigenvalues of X^*X on the diagonal.

We can take a square root of X^*X, namely B = U^* D^{1/2} U, where D^{1/2} is obtained by taking square roots of the diagonal entries of D (recall that D is diagonal). Then B^*=B, X^*X = B^2, and A = B^{-1} is the matrix you want.

Indeed, if Y=XA, then Y^*Y = A^* X^*X A = A^* B^2 A = A B^2 A = B^{-1} B^2 B^{-1} = I, where we used that A^* = A (since B^* = B implies A^* = (B^{-1})^* = B^{-1} = A).
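
A numerical sketch of this construction (my own illustration, not part of the original thread; NumPy's eigh returns X^*X = V D V^*, so V plays the role of U^* above):

import numpy as np

# Sketch of A = (X^* X)^{-1/2}, built via the eigendecomposition above.
rng = np.random.default_rng(0)
n, p = 6, 3
X = rng.standard_normal((n, p))            # real here, but the same code
                                           # works for complex X

M = X.conj().T @ X                         # X^* X: Hermitian, positive definite
w, V = np.linalg.eigh(M)                   # M = V @ diag(w) @ V^*, with w > 0
B = V @ np.diag(np.sqrt(w)) @ V.conj().T   # square root: B = B^*, B @ B = M
A = np.linalg.inv(B)                       # A = B^{-1} = (X^* X)^{-1/2}

Y = X @ A
print(np.allclose(B @ B, M))                     # True
print(np.allclose(Y.conj().T @ Y, np.eye(p)))    # True: Y^* Y = I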
 
Thank you for such a good explanation.
 
