Proving Matrix Equality Using Singular Value Decomposition

In summary, the conversation concerns proving the equivalence of $AA^T = BB^T$ and $A = BO$, where $O$ is an orthogonal matrix. The original poster suggests using the singular value decomposition (SVD) but is unsure whether it is needed. The expert clarifies that SVD is only needed for the forward direction of the proof. The forward and reverse directions are then spelled out, with the expert pointing out that if $AA^T = BB^T$, then $A$ and $B$ have the same singular values. Finally, the expert shows how SVD yields an orthogonal matrix $O$ with $A = BO$.
  • #1
linearishard
Hi, I have another question. If $A$ and $B$ are $m \times n$ matrices, how do I prove that $AA^T = BB^T$ iff $A = BO$ for some orthogonal matrix $O$? I think I need to use a singular value decomposition, but I'm not sure. Thanks!
 
  • #2
Can you at least prove the reverse direction, that is, if $A = BO$ for some orthogonal matrix $O$, then $AA^T = BB^T$? You don't need to use SVD for this.
 
  • #3
Yeah, I did that, but it seemed too simple. My study guide says I should be using SVD. Is it actually unnecessary?
 
  • #4
You use SVD for the forward direction, not the reverse direction.
 
  • #5
What do you mean by that? What are the forward and reverse directions?
 
  • #6
The forward direction: If $AA^T = BB^T$, then $A = BO$ where $O$ is some orthogonal matrix. The reverse direction: If $A = BO$ where $O$ is an orthogonal matrix, then $AA^T = BB^T$.
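For reference, the reverse direction is a one-line computation using $OO^T = I$: if $A = BO$, then $AA^T = (BO)(BO)^T = B(OO^T)B^T = BB^T$.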
 
  • #7
If $AA^T = BB^T$, then $A$ and $B$ have the same singular values, since the squared singular values of $A$ are exactly the eigenvalues of $AA^T$. Better yet, diagonalize $AA^T = BB^T = U\Lambda U^T$: this lets you write $A = U\Sigma V^T$ and $B = U\Sigma Q^T$ with the same $U$ and $\Sigma$, for some orthogonal matrices $V$ and $Q$. Then $A = BO$ with $O = QV^T$, because $BO = U\Sigma Q^T Q V^T = U\Sigma V^T = A$. Since transposes and products of orthogonal matrices are orthogonal, $O$ is orthogonal.
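Here is a minimal NumPy sketch of both directions (all names are illustrative). Since numerical SVDs of $A$ and $B$ need not return the same $U$, the sketch recovers an orthogonal $O$ by a slightly different route, the orthogonal Procrustes solution $O = UV^T$ from the SVD $B^TA = USV^T$, which attains zero residual here precisely because an exact orthogonal solution exists:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 3, 5

# Build B and a random orthogonal O_true (QR of a Gaussian matrix); set A = B O_true.
B = rng.standard_normal((m, n))
O_true, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = B @ O_true

# Reverse direction: A A^T = B (O O^T) B^T = B B^T.
assert np.allclose(A @ A.T, B @ B.T)

# Equal Gram matrices force equal singular values.
assert np.allclose(np.linalg.svd(A, compute_uv=False),
                   np.linalg.svd(B, compute_uv=False))

# Forward direction: recover an orthogonal O with A = B O via orthogonal
# Procrustes (SVD of B^T A).
U, _, Vt = np.linalg.svd(B.T @ A)
O_hat = U @ Vt
assert np.allclose(O_hat @ O_hat.T, np.eye(n))  # O_hat is orthogonal
assert np.allclose(B @ O_hat, A)                # and A = B O_hat
```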
 

1. What is Singular Value Decomposition (SVD)?

Singular Value Decomposition (SVD) is a matrix factorization from linear algebra that writes any real $m \times n$ matrix $A$ as $A = U\Sigma V^T$, where $U$ and $V$ are orthogonal matrices (whose columns are the left and right singular vectors) and $\Sigma$ is a diagonal matrix of nonnegative singular values. It is widely used in data analysis, signal processing, and image compression.
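As a minimal illustration using NumPy's `numpy.linalg.svd` (the matrix here is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 6))

# Thin SVD: U is 4x4 with orthonormal columns, s holds the singular values
# in descending order, and Vt is 4x6 with orthonormal rows.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
assert np.allclose(U @ np.diag(s) @ Vt, A)  # exact reconstruction
```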

2. What are the applications of SVD?

SVD has a wide range of applications in fields such as image and signal processing, data compression, natural language processing, and recommendation systems. It is also used in solving linear least-squares problems (via the pseudoinverse), data clustering, and low-rank data approximation.

3. How does SVD work?

SVD factors a matrix $A$ into three matrices: $U$, $\Sigma$, and $V$. The columns of $U$ are eigenvectors of $AA^T$, the diagonal of $\Sigma$ holds the square roots of the eigenvalues of $AA^T$ (equivalently, of the nonzero eigenvalues of $A^TA$), and the columns of $V$ are eigenvectors of $A^TA$.
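A short NumPy sketch of these relationships (the matrix is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 5))

U, s, Vt = np.linalg.svd(A)

# The eigenvalues of A A^T are the squared singular values of A...
eigvals = np.sort(np.linalg.eigvalsh(A @ A.T))[::-1]
assert np.allclose(eigvals, s**2)

# ...and the columns of U are corresponding orthonormal eigenvectors.
assert np.allclose((A @ A.T) @ U, U @ np.diag(s**2))
```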

4. What are the advantages of using SVD?

SVD has several advantages: it is computed by numerically stable algorithms, it is robust to noise when a low-rank truncation is used, and it can reduce the dimensionality of data while preserving the directions of greatest variance. Related low-rank techniques built on SVD are also used to impute missing values in data, and truncated variants scale to very large datasets.
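To illustrate the noise-robustness claim, here is a small synthetic sketch (the rank, sizes, and noise level are arbitrary choices): truncating the SVD of a noisy low-rank matrix at the true rank recovers the underlying matrix much better than the raw observation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ground truth: a rank-5 matrix observed through additive Gaussian noise.
L = rng.standard_normal((200, 5))
R = rng.standard_normal((5, 100))
X_clean = L @ R
X_noisy = X_clean + 0.5 * rng.standard_normal(X_clean.shape)

# Denoise by truncating the SVD at the true rank.
U, s, Vt = np.linalg.svd(X_noisy, full_matrices=False)
X_denoised = U[:, :5] @ np.diag(s[:5]) @ Vt[:5, :]

print(np.linalg.norm(X_noisy - X_clean))     # error of the raw observation
print(np.linalg.norm(X_denoised - X_clean))  # much smaller after truncation
```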

5. Can SVD be used for data compression?

Yes. Truncating the SVD to the largest singular values yields the best low-rank approximation of a data matrix, reducing dimensionality while preserving the most important structure; this can mean significant storage and computational savings for large datasets. (Note that mainstream codecs such as JPEG and MPEG are built on the discrete cosine transform rather than SVD, though truncated SVD is a standard textbook approach to image compression.)
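A sketch of compression by truncated SVD (sizes and rank are arbitrary). By the Eckart-Young theorem, keeping the $k$ largest singular values gives the best rank-$k$ approximation in the Frobenius norm:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 80))

U, s, Vt = np.linalg.svd(X, full_matrices=False)

k = 10  # keep only the k largest singular values
X_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# The approximation error equals the norm of the discarded singular values,
# and storage drops from 100*80 numbers to k*(100 + 80 + 1).
assert np.isclose(np.linalg.norm(X - X_k), np.linalg.norm(s[k:]))
```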
