Singular Value Decomposition

  • Thread starter Dschumanji
  • #1

Homework Statement


I want to do a singular value decomposition for the following matrix:

[tex]M=\left(\begin{array}{cc} 1 & -1\\ 1 & -1\\ \end{array}\right)[/tex]



Homework Equations


[itex]M=U\Sigma V^{\ T}[/itex]

[itex]M^{\ T}M[/itex]

[itex]MM^{\ T}[/itex]



The Attempt at a Solution


To determine the singular values for [itex]\Sigma[/itex], I first found the eigenvalues of [itex]M^{\ T}M[/itex] (I could equally have used [itex]MM^{\ T}[/itex]). The eigenvalues are 4 and 0, so the singular values are 2 and 0, and I end up with the following matrix:

[tex]\Sigma=\left(\begin{array}{cc} 2 & 0\\ 0 & 0\\ \end{array}\right)[/tex]
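As a quick sanity check, the squared singular values are the eigenvalues of [itex]M^{\ T}M[/itex], which for a symmetric 2×2 matrix follow from the quadratic formula. A plain-Python sketch (no libraries assumed):

```python
import math

# M from the problem statement.
M = [[1, -1], [1, -1]]

def matmul(A, B):
    """Multiply two 2x2 matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def transpose(A):
    return [[A[j][i] for j in range(2)] for i in range(2)]

MtM = matmul(transpose(M), M)  # [[2, -2], [-2, 2]]

# Eigenvalues of a symmetric 2x2 [[a, b], [b, c]] via the quadratic formula.
a, b, c = MtM[0][0], MtM[0][1], MtM[1][1]
disc = math.sqrt((a - c) ** 2 + 4 * b * b)
eigs = [(a + c + disc) / 2, (a + c - disc) / 2]  # 4.0 and 0.0

# Singular values are the square roots of the eigenvalues.
sigmas = [math.sqrt(e) for e in eigs]  # 2.0 and 0.0
```

This agrees with the singular values 2 and 0 above.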

The columns of [itex]V[/itex] are the eigenvectors of [itex]M^{\ T}M[/itex]. So I end up with the following:

[tex]V=\left(\begin{array}{cc} -1 & 1\\ 1 & 1\\ \end{array}\right)[/tex]

Since [itex]V[/itex] is symmetric, its transpose is no different from [itex]V[/itex] itself. To determine [itex]U[/itex], I find the eigenvectors of [itex]MM^{\ T}[/itex]. I end up with the following matrix:

[tex]U=\left(\begin{array}{cc} 1 & -1\\ 1 & 1\\ \end{array}\right)[/tex]

Scaling the columns of [itex]U[/itex] and [itex]V[/itex] to unit length (so that the matrices become orthogonal) introduces a factor of [itex]1/\sqrt{2}[/itex] in each, so [itex]0.5\,U\Sigma V^{\ T}[/itex] with the unnormalized matrices should yield [itex]M[/itex]. It does not, though. Instead I end up with the following matrix:

[tex]0.5U\Sigma V^{\ T}=\left(\begin{array}{cc} -1 & 1\\ -1 & 1\\ \end{array}\right)[/tex]
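The attempt above can be reproduced with plain 2×2 arithmetic (a self-contained sketch, no libraries assumed). Normalizing the columns contributes the overall factor of 0.5, but the product still comes out with flipped signs:

```python
def matmul(A, B):
    """Multiply two 2x2 matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# Unnormalized factors from the attempt above.
U = [[1, -1], [1, 1]]
Sigma = [[2, 0], [0, 0]]
Vt = [[-1, 1], [1, 1]]  # V is symmetric here, so V^T = V

# 0.5 absorbs the two 1/sqrt(2) normalization factors.
prod = matmul(matmul(U, Sigma), Vt)
attempt = [[0.5 * x for x in row] for row in prod]
# attempt is [[-1.0, 1.0], [-1.0, 1.0]], not M = [[1, -1], [1, -1]]
```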

What am I doing wrong?!
 

Answers and Replies

  • #2
Hi Dschumanji! :smile:

You must choose the eigenvector bases to have the same orientation. That is, you must choose U and V such that det(U) = det(V), which is not the case here. One solution is to set

[tex]U=\left(\begin{array}{cc} -1 & -1\\ -1 & 1\\ \end{array}\right)[/tex]

This matrix, with the V you already had, will yield

[tex]M=0.5U\Sigma V^T[/tex]
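This suggestion can be checked numerically with the same plain 2×2 arithmetic (a minimal sketch, no libraries assumed): with the sign-flipped U, the product recovers M exactly.

```python
def matmul(A, B):
    """Multiply two 2x2 matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# Corrected U, with the original Sigma and V.
U = [[-1, -1], [-1, 1]]
Sigma = [[2, 0], [0, 0]]
Vt = [[-1, 1], [1, 1]]  # V is symmetric, so V^T = V

prod = matmul(matmul(U, Sigma), Vt)
M_rec = [[0.5 * x for x in row] for row in prod]
# M_rec is [[1.0, -1.0], [1.0, -1.0]], which is M
```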
 
  • #3
Thanks, Micromass, for the help and also for providing an example of how to make a matrix with LaTeX! None of the textbooks or tutorials online mention that the relative orientation of the eigenvectors should be the same for both U and V. I have been trying this suggestion out on other SVD problems and it seems to work well. However, there are some problems where the relative orientation of the vectors does not matter. For example, the really simple matrix:

[tex]M=\left(\begin{array}{cc} -3 & 0\\ 0 & 0\\ \end{array}\right)[/tex]

Can have the following SVD:

[tex]M = \left(\begin{array}{cc} -1 & 0\\ 0 & 1\\ \end{array}\right)\left(\begin{array}{cc} 3 & 0\\ 0 & 0\\ \end{array}\right)\left(\begin{array}{cc} 1 & 0\\ 0 & 1\\ \end{array}\right)[/tex]

The relative orientation of the eigenvectors in U is not the same as in V, and the determinants of U and V are also not equal.
 
  • #4

OK, I spoke too quickly. You certainly need to take orientation into account, but you don't always want to take the same orientation.

Let's say that

[tex]M=U\Sigma V^T[/tex]

Then

[tex]\det(M)=\det(U)\det(\Sigma)\det(V)[/tex]

You know that det(U) and det(V) are each 1 or -1. So if det(M) and [itex]\det(\Sigma)[/itex] have the same sign, you need to choose U and V with the same orientation; if the signs are opposite, you need to choose U and V with opposite orientations.

The question is somewhat trickier for non-square matrices, though. I agree that textbooks and websites should mention this...
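The sign rule also explains the diag(-3, 0) example: there det(M) = 0 and det(Σ) = 0, so the rule imposes no constraint at all, and either orientation works. A small sketch of the 2×2 check (plain Python, no libraries assumed):

```python
def det2(A):
    """Determinant of a 2x2 matrix."""
    return A[0][0] * A[1][1] - A[0][1] * A[1][0]

M = [[-3, 0], [0, 0]]
Sigma = [[3, 0], [0, 0]]

dM, dS = det2(M), det2(Sigma)

# det(M) = det(U) det(Sigma) det(V) with det(U), det(V) in {+1, -1},
# so a sign mismatch between det(M) and det(Sigma) must be absorbed
# by choosing U and V with opposite orientations.
same_orientation_required = (dM > 0 and dS > 0) or (dM < 0 and dS < 0)
opposite_orientation_required = (dM > 0 and dS < 0) or (dM < 0 and dS > 0)
# Here dM == dS == 0: neither constraint applies, so orientation is free.
```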
 
  • #5
Ah, that indeed makes sense. Thanks again!
 
