Singular Value Decomposition for a Given Matrix

In summary, you must take the orientation of the eigenvectors into account when computing a singular value decomposition.
  • #1
Dschumanji

Homework Statement


I want to do a singular value decomposition for the following matrix:

[tex]M=\left(\begin{array}{cc} 1 & -1\\ 1 & -1\\ \end{array}\right)[/tex]



Homework Equations


[itex]M=U\Sigma V^{\ T}[/itex]

[itex]M^{\ T}M[/itex]

[itex]MM^{\ T}[/itex]



The Attempt at a Solution


To determine the singular values for [itex]\Sigma[/itex], I first found the eigenvalues of [itex]M^{\ T}M[/itex] (I could have also used [itex]MM^{\ T}[/itex]). The eigenvalues are 4 and 0, so the singular values are 2 and 0, and I end up with the following matrix:

[tex]\Sigma=\left(\begin{array}{cc} 2 & 0\\ 0 & 0\\ \end{array}\right)[/tex]
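
For reference, here is the intermediate computation behind those eigenvalues:

[tex]M^{\ T}M=\left(\begin{array}{cc} 2 & -2\\ -2 & 2\\ \end{array}\right),\qquad \det\left(M^{\ T}M-\lambda I\right)=\lambda^{2}-4\lambda=\lambda(\lambda-4)[/tex]

so the eigenvalues are 4 and 0, and the singular values are their square roots, 2 and 0.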

The columns of [itex]V[/itex] are the eigenvectors of [itex]M^{\ T}M[/itex]. So I end up with the following:

[tex]V=\left(\begin{array}{cc} -1 & 1\\ 1 & 1\\ \end{array}\right)[/tex]

The transpose of [itex]V[/itex] turns out to be no different from [itex]V[/itex], since [itex]V[/itex] is symmetric. To determine [itex]U[/itex], I find the eigenvectors of [itex]MM^{\ T}[/itex] and end up with the following matrix:

[tex]U=\left(\begin{array}{cc} 1 & -1\\ 1 & 1\\ \end{array}\right)[/tex]
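
For reference, the corresponding computation on the left side is

[tex]MM^{\ T}=\left(\begin{array}{cc} 2 & 2\\ 2 & 2\\ \end{array}\right)[/tex]

which has the same eigenvalues, 4 and 0, with eigenvectors [itex](1,1)^T[/itex] and [itex](-1,1)^T[/itex].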

Scaling the columns of [itex]U[/itex] and [itex]V[/itex] so that the matrices become orthogonal means dividing each column by [itex]\sqrt{2}[/itex], so the product [itex]U\Sigma V^{\ T}[/itex] multiplied by [itex]0.5[/itex] should yield the matrix [itex]M[/itex]. It does not, though. Instead I end up with the following matrix:

[tex]0.5U\Sigma V^{\ T}=\left(\begin{array}{cc} -1 & 1\\ -1 & 1\\ \end{array}\right)[/tex]
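
Just to double-check my arithmetic, here is the same product in NumPy (a minimal sketch of the unnormalized matrices above):

[code]
import numpy as np

# Unnormalized eigenvector matrices from the work above
U = np.array([[1.0, -1.0],
              [1.0,  1.0]])
S = np.array([[2.0, 0.0],
              [0.0, 0.0]])
V = np.array([[-1.0, 1.0],
              [ 1.0, 1.0]])

# The 0.5 absorbs the 1/sqrt(2) normalization of each column of U and V
print(0.5 * U @ S @ V.T)  # [[-1.  1.]
                          #  [-1.  1.]]  -- not M!
[/code]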

What am I doing wrong?!
 
  • #2
Hi Dschumanji! :smile:

You must choose the bases of eigenvectors so that they have the same orientation; that is, you must choose U and V such that det(U) = det(V). This is not the case here. A solution is to set

[tex]U=\left(\begin{array}{cc} -1 & -1\\ -1 & 1\\ \end{array}\right)[/tex]

This matrix, with the V you already had, will yield

[tex]M=0.5U\Sigma V^T[/tex]
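
(If you want to double-check the fix numerically, a quick NumPy sketch:)

[code]
import numpy as np

U = np.array([[-1.0, -1.0],
              [-1.0,  1.0]])
S = np.array([[ 2.0,  0.0],
              [ 0.0,  0.0]])
V = np.array([[-1.0,  1.0],
              [ 1.0,  1.0]])

# Same 0.5 normalization factor as before; now the signs line up
print(0.5 * U @ S @ V.T)  # [[ 1. -1.]
                          #  [ 1. -1.]]  -- this is M
[/code]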
 
  • #3
micromass said:
You must choose the bases of eigenvectors so that they have the same orientation; that is, you must choose U and V such that det(U) = det(V). [...]
Thanks, micromass, for the help and also for providing an example of how to make a matrix in LaTeX! None of the textbooks and tutorials online mention that the relative orientation of the eigenvectors should be the same for both U and V. I have been trying this suggestion out on other SVD problems and it seems to work pretty well. However, there are some problems where the relative orientation of the vectors does not matter. For example, the really simple matrix:

[tex]M=\left(\begin{array}{cc} -3 & 0\\ 0 & 0\\ \end{array}\right)[/tex]

can have the following SVD:

[tex]M = \left(\begin{array}{cc} -1 & 0\\ 0 & 1\\ \end{array}\right)\left(\begin{array}{cc} 3 & 0\\ 0 & 0\\ \end{array}\right)\left(\begin{array}{cc} 1 & 0\\ 0 & 1\\ \end{array}\right)[/tex]

The relative orientation of the eigenvectors in U is not the same as in V. The determinants of U and V are also not the same.
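
Here is a quick NumPy check of that decomposition (a sketch; the matrices are just the factors above):

[code]
import numpy as np

U = np.array([[-1.0, 0.0],
              [ 0.0, 1.0]])
S = np.array([[ 3.0, 0.0],
              [ 0.0, 0.0]])
V = np.eye(2)  # V^T = V here

print(U @ S @ V.T)  # [[-3.  0.]
                    #  [ 0.  0.]]  -- equals M
print(np.linalg.det(U), np.linalg.det(V))  # -1.0 and 1.0: opposite orientations
[/code]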
 
  • #4
Dschumanji said:
However, there are some problems where the relative orientation of the vectors does not matter. [...] The determinants of U and V are also not the same.

OK, I spoke too quickly. You certainly need to take orientation into account, but maybe you don't always want to take the same orientation.

Let's say that

[tex]M=U\Sigma V^T[/tex]

Then

[tex]\det(M)=\det(U)\det(\Sigma)\det(V)[/tex]

You know that det(U) and det(V) are 1 or -1. So if the signs of det(M) and [itex]\det(\Sigma)[/itex] are equal, you need to choose U and V with the same orientation; if the signs are opposite, you need to choose U and V with opposite orientations. (In your example, det(M) = det(Σ) = 0, so the determinants don't constrain the choice at all, which is why either orientation works there.)

The question is somewhat trickier for non-square matrices, though. I agree that textbooks and websites should mention this...
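
For what it's worth, there is a sign convention that always works (it's standard, though not something we used above): choose the columns [itex]v_i[/itex] of V first, and then define each left singular vector directly by

[tex]u_i=\frac{1}{\sigma_i}Mv_i\qquad\text{for each }\sigma_i>0[/tex]

completing U arbitrarily to an orthonormal basis for the directions with [itex]\sigma_i=0[/itex]. In the original problem this gives [itex]u_1=\tfrac{1}{2}M\cdot\tfrac{1}{\sqrt{2}}(-1,1)^T=\tfrac{1}{\sqrt{2}}(-1,-1)^T[/itex], which is exactly the first column of the U I suggested.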
 
  • #5
Ah, that indeed makes sense. Thanks again!
 

1. What is Singular Value Decomposition (SVD)?

Singular Value Decomposition (SVD) is a mathematical technique used to decompose a matrix into three matrices, U, Σ, and V, with the original matrix equal to the product UΣV^T. It is a powerful tool for data analysis and can be used for dimensionality reduction, data compression, and image processing.

2. What are the applications of SVD?

SVD has various applications in fields such as signal processing, image processing, data mining, and recommender systems. It is also used in natural language processing, text classification, and data compression.

3. How does SVD work?

SVD works by decomposing a matrix into three matrices, U, Σ, and V, such that the original matrix can be reconstructed as the product UΣV^T. The U matrix contains the left singular vectors, Σ is a diagonal matrix containing the singular values, and V contains the right singular vectors.
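
For example, a minimal NumPy sketch using the matrix from this thread (numpy.linalg.svd returns the singular values as a vector, expanded here with np.diag):

[code]
import numpy as np

M = np.array([[1.0, -1.0],
              [1.0, -1.0]])

U, s, Vt = np.linalg.svd(M)   # Vt is V transposed

print(s)                      # approx. [2. 0.] -- the singular values
print(U @ np.diag(s) @ Vt)    # reconstructs M (up to rounding)
[/code]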

4. What is the importance of SVD in data analysis?

SVD is important in data analysis because it helps in identifying patterns and relationships in high-dimensional data. It can also be used for dimensionality reduction, which is useful for visualizing and understanding complex datasets. SVD is also used in data compression, which is crucial for storage and transmission of large datasets.
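
As a concrete illustration of the dimensionality-reduction use, here is a sketch of a rank-k truncation, keeping only the k largest singular values (the helper name truncated_svd and the random test matrix are made up for the example):

[code]
import numpy as np

def truncated_svd(A, k):
    """Best rank-k approximation of A in the least-squares sense
    (Eckart-Young), built from the top k singular triplets."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

A = np.random.default_rng(0).normal(size=(6, 4))
A2 = truncated_svd(A, 2)
print(np.linalg.matrix_rank(A2))  # 2
[/code]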

5. Are there any limitations of SVD?

One limitation of SVD is that it is computationally expensive for large datasets. As a linear method, it also captures only linear structure in the data, so it may miss nonlinear relationships. Additionally, plain SVD does not handle missing values or outliers well, which can lead to misleading decompositions.
