Transforming a matrix to an orthogonal one

In summary: given a matrix X of size n x p, n > p, with p linearly independent columns, there exists a p x p matrix A such that Y=XA satisfies Y^TY=I. One way to obtain Y is to apply the "Gram-Schmidt orthogonalization" process to the columns of X, with A the corresponding change-of-basis matrix. Alternatively, A can be computed a priori as the inverse of the positive-definite square root of X^*X.
  • #1
onako
Suppose a matrix X of size n x p is given, n > p, with p linearly independent columns. Can it be guaranteed that there exists a matrix A of size p x p that converts the columns of X to orthonormal columns? In other words, is there an A such that Y=XA and Y^TY=I, where I is the p x p identity matrix?
 
  • #2
Yes. Since the columns of X are independent, they form a basis for R^n. We can then use the "Gram-Schmidt orthogonalization" process to construct an orthonormal basis from them. A will be the "change of basis" matrix that changes the representation of a vector in the original basis to its representation in the orthonormal basis.
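A minimal NumPy sketch of the Gram-Schmidt construction described above (the function name gram_schmidt and the random test matrix are illustrative, not part of the thread):

[code]
import numpy as np

def gram_schmidt(X):
    """Orthonormalize the columns of X (assumed linearly independent)."""
    n, p = X.shape
    Y = np.zeros((n, p))
    for j in range(p):
        v = X[:, j].astype(float)
        # Subtract projections onto the orthonormal columns built so far.
        for i in range(j):
            v -= (Y[:, i] @ X[:, j]) * Y[:, i]
        Y[:, j] = v / np.linalg.norm(v)
    return Y

rng = np.random.default_rng(0)
X = rng.standard_normal((6, 3))          # n = 6, p = 3; columns are generically independent
Y = gram_schmidt(X)
print(np.allclose(Y.T @ Y, np.eye(3)))   # True: Y^T Y = I
[/code]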
 
  • #3
Thanks. Just one note: I suppose you've taken into account that there are p columns in X (which is an n x p matrix). If I'm not wrong, only n linearly independent vectors in R^n define a basis of R^n.

So, given an input X with linearly independent columns, those columns could be transformed by the Gram-Schmidt process to yield Y such that Y^TY=I, and there exists A such that Y=XA. How could one calculate such an A a priori?
 
  • #4
Yes, Onako, it is true, but for a different reason than the one stated by HallsofIvy.

If the columns of [itex]X[/itex] are linearly independent, then [itex]X^*X[/itex] is an invertible [itex]p\times p[/itex] matrix. Here [itex]X^*[/itex] is the Hermitian conjugate of [itex]X[/itex], i.e. the conjugate transpose of [itex]X[/itex]; if [itex]X[/itex] is real, then [itex]X^*=X^T[/itex] (I use [itex]X^*[/itex] only because what I say works for complex matrices as well).

The matrix [itex]X^*X[/itex] is positive semidefinite for any [itex]X[/itex], and since [itex]X^*X[/itex] is invertible here, it is positive definite (all eigenvalues are positive). Since [itex]X^*X[/itex] is Hermitian (symmetric if [itex]X[/itex] is real), it can be diagonalized, i.e. it can be represented as a diagonal matrix in some orthonormal basis, or equivalently, it can be written as [itex]X^*X = U^* D U[/itex], where [itex]U[/itex] is a unitary matrix ([itex]U^{-1}=U^*[/itex]) and [itex]D[/itex] is a diagonal matrix with the eigenvalues of [itex]X^*X[/itex] on the diagonal.

We can take a square root of [itex]X^*X[/itex], namely [itex]B = U^* D^{1/2} U[/itex], where [itex]D^{1/2}[/itex] is obtained by taking the square roots of the diagonal entries of [itex]D[/itex] (recall that [itex]D[/itex] is a diagonal matrix). Then [itex]B^*=B[/itex], [itex]X^*X = B^2[/itex], and [itex]A=B^{-1}[/itex] is the matrix you want.

Indeed, if [itex]Y=XA[/itex], then [itex]Y^*Y = A^* X^*X A = A^* B^2 A = A B^2 A = B^{-1} B^2 B^{-1} = I[/itex], where [itex]A^*=A[/itex] because [itex]A=B^{-1}[/itex] and [itex]B^*=B[/itex].
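A hedged NumPy sketch of this construction for a real X: B is built as the positive square root of X^TX from its eigendecomposition, and A = B^{-1} (the variable names and the random test matrix are illustrative, not part of the thread):

[code]
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((6, 3))          # real X, so X^* = X^T

XtX = X.T @ X                            # symmetric positive definite (columns independent)
w, U = np.linalg.eigh(XtX)               # XtX = U @ diag(w) @ U.T, with w > 0
B = U @ np.diag(np.sqrt(w)) @ U.T        # positive square root: B = B^T and B @ B = XtX
A = np.linalg.inv(B)                     # A = B^{-1}

Y = X @ A
print(np.allclose(Y.T @ Y, np.eye(3)))   # True: Y^T Y = A B^2 A = I
[/code]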
 
  • #5
Thank you for such a good explanation.
 

Related to Transforming a matrix to an orthogonal one

1. How is a matrix transformed to an orthogonal one?

A matrix is transformed to an orthogonal one through a process called orthogonalization, which applies a sequence of operations to the columns of the matrix to produce a new matrix whose columns are orthonormal (mutually perpendicular unit vectors) and span the same column space as the original.

2. What is the purpose of transforming a matrix to an orthogonal one?

The main purpose of transforming a matrix to an orthogonal one is to simplify calculations and improve numerical stability, particularly in linear algebra and numerical computation. Matrices with orthonormal columns preserve lengths and inner products, which makes them useful in least-squares problems, QR factorization, and many other applications.

3. What are some common methods for transforming a matrix to an orthogonal one?

Some common methods for transforming a matrix to an orthogonal one include Gram-Schmidt orthogonalization, Householder reflections, and Givens rotations. Each of these can be viewed as computing a QR factorization X = QR, where Q has orthonormal columns and R is upper triangular.
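For instance, NumPy's built-in QR factorization (backed by LAPACK's Householder-based routines) performs this orthogonalization in one call; in the notation of the thread above, Y = Q and A = R^{-1}. A brief illustrative sketch:

[code]
import numpy as np

rng = np.random.default_rng(2)
X = rng.standard_normal((6, 3))

# Reduced QR: Q is 6x3 with orthonormal columns, R is 3x3 upper triangular.
Q, R = np.linalg.qr(X)
print(np.allclose(Q.T @ Q, np.eye(3)))   # True
print(np.allclose(Q @ R, X))             # True: X = Q R, so A = R^{-1} in the thread's notation
[/code]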

4. Can any matrix be transformed to an orthogonal one?

No, not every matrix can be transformed to an orthogonal one: its columns must be linearly independent. Strictly speaking, an orthogonal matrix is also square, since it must satisfy both Y^TY = I and YY^T = I. However, as the thread above shows, a rectangular n x p matrix with linearly independent columns can still have its columns made orthonormal, yielding a (non-square) Y with Y^TY = I.

5. How is the orthogonality of a matrix determined?

The orthogonality of a matrix is determined by checking whether its column vectors are orthogonal to each other. This can be done by calculating the dot product of each pair of column vectors, which should be 0 if they are orthogonal. A matrix is considered orthogonal if all of its column vectors are mutually orthogonal and have a magnitude of 1; equivalently, a matrix Y has orthonormal columns exactly when Y^TY = I, which is the condition used throughout the thread above.
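A short illustrative NumPy sketch of this check (the helper has_orthonormal_columns is an example name, not a standard function):

[code]
import numpy as np

def has_orthonormal_columns(M, tol=1e-10):
    """Check that the columns of M are mutually orthogonal unit vectors."""
    G = M.T @ M                          # Gram matrix: entry (i, j) is the dot product of columns i and j
    return np.allclose(G, np.eye(M.shape[1]), atol=tol)

Q, _ = np.linalg.qr(np.random.default_rng(3).standard_normal((5, 5)))
print(has_orthonormal_columns(Q))        # True: a square matrix passing this test is orthogonal
[/code]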
