SVD vs Eigenvalue Decomposition (Diagonalizability)

In summary, a matrix A ∈ ℝ^{n×n} is diagonalizable if and only if it has n linearly independent eigenvectors; being symmetric or having n distinct eigenvalues are sufficient conditions for this, not necessary ones. In contrast, there is no restriction on a matrix for performing the SVD (A = UΣV^T) on it: every matrix, regardless of the state of its eigenvectors, can be decomposed into three parts, one diagonal and two orthogonal. This does not mean that every matrix is diagonalizable, however, because U and V are generally different, so UΣV^T is not a similarity transformation, and the diagonal entries in the SVD are the singular values of A, not its eigenvalues. They are only the same if A is (symmetric) positive semidefinite.
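A minimal NumPy sketch (the Jordan block below is an arbitrary example chosen for illustration, not taken from the thread) showing a matrix that is not diagonalizable yet still has an SVD, with singular values that are not its eigenvalues:

```python
import numpy as np

# A 2x2 Jordan block: eigenvalue 1 is repeated and there is only one
# independent eigenvector, so A is NOT diagonalizable.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])

w, V = np.linalg.eig(A)
print(w)                                     # [1. 1.]  repeated eigenvalue
print(np.linalg.matrix_rank(V, tol=1e-10))   # 1 -> fewer than n independent eigenvectors

# The SVD still exists and reconstructs A exactly.
U, S, Vt = np.linalg.svd(A)
print(S)                                     # ~[1.618, 0.618] -- not the eigenvalues of A
print(np.allclose(U @ np.diag(S) @ Vt, A))   # True: A = U Σ V^T
print(np.allclose(U, Vt.T))                  # False: U != V, so this is not a similarity transform
```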
  • #1
eehsun
Okay, I know that if I can't get n linearly independent eigenvectors out of a matrix A ∈ ℝ^{n×n}, it is not diagonalizable
(and that some necessary conditions for diagonalizability in this regard may be being symmetric and/or having distinct eigenvalues).
This is how things are for the usual eigenvalue decomposition (A = SΛS^{-1}), right?

However, if I am not mistaken, there is no such restriction on a matrix for being able to perform the SVD (A = UΣV^T) on it. I mean, every matrix, irrespective of the state of its eigenvectors, can be decomposed into three parts, one diagonal and two orthogonal, right? So doesn't this mean that every matrix is diagonalizable, regardless of its eigenvectors?

Thanks..

Edit:
I know that the answer to the question is no, but I don't understand why we don't consider Σ to be a diagonalized form of A.
Please correct me wherever I am mistaken..
Thanks again.. :)
 
  • #2
hi!
U and V are generally different, so A = UΣV^T cannot be considered a diagonalization of A.
Also, the diagonal entries of the SVD are the square roots of the eigenvalues of A A^T (equivalently A^T A), so they are not the eigenvalues of A itself. They are the same iff A is normal.
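A quick numerical check of that relationship (a NumPy sketch with a randomly generated matrix, just for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))            # any square matrix will do

S = np.linalg.svd(A, compute_uv=False)     # singular values, sorted in descending order
lam = np.linalg.eigvalsh(A.T @ A)          # eigenvalues of A^T A (symmetric PSD), ascending

# Guard against tiny negative rounding errors before taking the square root.
print(np.allclose(S, np.sqrt(np.maximum(lam[::-1], 0.0))))   # True
```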
 
  • #3
eehsun said:
Okay, I know that if I can't get n linearly independent eigenvectors out of a matrix A ∈ ℝ^{n×n}, it is not diagonalizable
(and that some necessary conditions for diagonalizability in this regard may be being symmetric and/or having distinct eigenvalues).
Those are sufficient conditions, not necessary conditions.
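For example (a NumPy sketch with an arbitrarily chosen matrix), a matrix that is neither symmetric nor has distinct eigenvalues can still be diagonalizable:

```python
import numpy as np

# Non-symmetric, with the repeated eigenvalue 1 (eigenvalues are 1, 1, 2),
# yet it still has 3 linearly independent eigenvectors.
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 2.0]])

w, V = np.linalg.eig(A)
print(w)                            # [1. 1. 2.]
print(np.linalg.matrix_rank(V))     # 3 -> n independent eigenvectors, so A is diagonalizable

# Confirm the diagonalization A = V diag(w) V^{-1}.
print(np.allclose(V @ np.diag(w) @ np.linalg.inv(V), A))   # True
```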

 
  • #4
td21 said:
hi!
U and V are generally different, so A = UΣV^T cannot be considered a diagonalization of A.
Also, the diagonal entries of the SVD are the square roots of the eigenvalues of A A^T (equivalently A^T A), so they are not the eigenvalues of A itself. They are the same iff A is normal.

Sorry... that should be positive semidefinite, not normal.
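A quick numerical check of the corrected statement (a NumPy sketch with two hand-picked symmetric matrices):

```python
import numpy as np

# Symmetric positive semidefinite: singular values equal the eigenvalues.
B = np.array([[2.0, 1.0],
              [1.0, 2.0]])                      # eigenvalues 1 and 3, both >= 0
print(np.linalg.eigvalsh(B))                    # [1. 3.]
print(np.linalg.svd(B, compute_uv=False))       # [3. 1.]  same values, just sorted descending

# Symmetric but indefinite (still normal): singular values are |eigenvalues|.
C = np.array([[1.0,  0.0],
              [0.0, -2.0]])
print(np.linalg.eigvalsh(C))                    # [-2.  1.]
print(np.linalg.svd(C, compute_uv=False))       # [2. 1.] -> not the eigenvalues of C themselves
```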
 
  • #5


I would like to clarify some points regarding the differences between SVD and eigenvalue decomposition (diagonalizability). First of all, both SVD and eigenvalue decomposition are methods used to decompose a matrix into simpler components. However, they have different purposes and applications.

Eigenvalue decomposition is primarily used for diagonalizing a square matrix, writing A = PDP^{-1}: the diagonal matrix D contains the eigenvalues of the original matrix on its main diagonal, and the corresponding eigenvectors form the columns of the transformation matrix P. This is useful for computing matrix powers and matrix functions, analyzing linear dynamical systems, and performing other operations on the matrix.

On the other hand, SVD can be used to decompose any matrix, not just square matrices. It decomposes a matrix into three parts, A = UΣV^T: a diagonal matrix Σ and two orthogonal matrices U and V. The diagonal matrix Σ contains the singular values of the original matrix, which are the square roots of the eigenvalues of A^T A (or A A^T), so they are related to, but in general different from, the eigenvalues of A. The matrices U and V contain the left and right singular vectors, respectively, which are in general not the eigenvectors of the original matrix.
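A small NumPy sketch (with an arbitrary non-symmetric matrix chosen for illustration) makes the distinction concrete:

```python
import numpy as np

# For a non-symmetric matrix the singular values/vectors differ from the eigenvalues/vectors.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

w, E = np.linalg.eig(A)        # eigenvalues and eigenvectors of A
U, S, Vt = np.linalg.svd(A)    # singular values and singular vectors of A

print(w)                       # [2. 3.]        eigenvalues of A
print(S)                       # ~[3.26, 1.84]  singular values of A -- not 2 and 3
print(np.allclose(U, Vt.T))    # False: U != V, so U Σ V^T is not a similarity transformation
```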

So, while every matrix can be decomposed with the SVD, not every square matrix can be diagonalized with an eigenvalue decomposition. Additionally, the diagonal matrix Σ in the SVD is not the same as the diagonal matrix in the eigenvalue decomposition: it contains singular values rather than eigenvalues, and because U and V differ, UΣV^T is not a similarity transformation.

In summary, while both the SVD and the eigenvalue decomposition produce a diagonal factor, they have different purposes and applications. Therefore, it is important to understand the differences between them and use the appropriate method depending on the problem at hand.
 

What is the difference between SVD and Eigenvalue Decomposition?

Both SVD (Singular Value Decomposition) and Eigenvalue Decomposition are methods used to break down a matrix into simpler components. However, SVD can be applied to any matrix, whereas Eigenvalue Decomposition can only be applied to square matrices with a full set of linearly independent eigenvectors. Additionally, SVD breaks a matrix into three factors (A = UΣV^T), while Eigenvalue Decomposition writes it as A = PDP^{-1} in terms of two matrices, the eigenvector matrix P and the diagonal eigenvalue matrix D.
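As a concrete illustration of the shapes involved (a NumPy sketch with an arbitrary 3×2 matrix):

```python
import numpy as np

# SVD of a non-square matrix: for a 3x2 input, U is 3x3, there are min(3, 2) = 2
# singular values, and V^T is 2x2.
M = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

U, S, Vt = np.linalg.svd(M)                # full SVD
print(U.shape, S.shape, Vt.shape)          # (3, 3) (2,) (2, 2)

# Rebuild M by padding the singular values into a 3x2 "diagonal" matrix.
Sigma = np.zeros(M.shape)
Sigma[:2, :2] = np.diag(S)
print(np.allclose(U @ Sigma @ Vt, M))      # True
```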

When should I use SVD instead of Eigenvalue Decomposition?

SVD is more versatile and can be used for a wider range of matrices, making it a preferred choice in many applications. In particular, SVD can handle non-square matrices and matrices that are not diagonalizable, which Eigenvalue Decomposition cannot. However, if you are working with square, diagonalizable matrices and are specifically interested in the eigenvalues and eigenvectors, Eigenvalue Decomposition may be a better choice.

How do SVD and Eigenvalue Decomposition relate to diagonalizability?

Diagonalizability is a property of a square matrix: A is diagonalizable if it is similar to a diagonal matrix, A = PDP^{-1}, which happens exactly when it has a full set of n linearly independent eigenvectors. The SVD, by contrast, exists for every matrix, square or not, but it is generally not a diagonalization in this sense: since U and V are different matrices, A = UΣV^T is not a similarity transformation, so the existence of the SVD says nothing about whether A is diagonalizable.
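A rough way to test the eigenvector condition numerically (a sketch only: floating-point rank checks are a heuristic, and the tolerance below is an arbitrary choice):

```python
import numpy as np

def is_diagonalizable(A, tol=1e-10):
    """Heuristic check: does A have n linearly independent eigenvectors?"""
    _, V = np.linalg.eig(A)
    return np.linalg.matrix_rank(V, tol=tol) == A.shape[0]

print(is_diagonalizable(np.array([[1.0, 1.0], [0.0, 1.0]])))   # False: defective Jordan block
print(is_diagonalizable(np.array([[2.0, 1.0], [1.0, 2.0]])))   # True: symmetric matrix
```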

Can SVD and Eigenvalue Decomposition be used interchangeably?

No, SVD and Eigenvalue Decomposition are not interchangeable. As mentioned before, SVD can be applied to any matrix, while Eigenvalue Decomposition is limited to square matrices with a full set of eigenvectors. Additionally, the results of the two methods are not the same: the singular values in Σ are generally not the eigenvalues in D, and the singular vectors in U and V are generally not the eigenvectors in P. Therefore, it is important to understand the specific properties and applications of each method before deciding which one to use.

Are there any advantages of using SVD over Eigenvalue Decomposition?

Yes, there are several advantages of using SVD over Eigenvalue Decomposition. As mentioned before, SVD can handle a wider range of matrices, including non-square and non-diagonalizable ones, and it exists for every matrix, whereas an eigenvalue decomposition may not exist at all. SVD is also computed with orthogonal transformations, which makes it numerically stable and less sensitive to small changes in the input matrix, making it a preferred choice in many applications.
