Singular Value Decomposition for a Given Matrix

Dschumanji

Homework Statement


I want to do a singular value decomposition for the following matrix:

M = \left(\begin{array}{cc} 1 & -1\\ 1 & -1\\ \end{array}\right)



Homework Equations


M=U\Sigma V^{\ T}

M^{\ T}M

MM^{\ T}



The Attempt at a Solution


To determine the singular values for \Sigma, I first found the eigenvalues of M^{\ T}M (I could also have used MM^{\ T}). The eigenvalues are 4 and 0, so the singular values are 2 and 0. So I end up with the following matrix:

\Sigma = \left(\begin{array}{cc} 2 & 0\\ 0 & 0\\ \end{array}\right)
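As a numerical cross-check of this step (a numpy sketch; the thread itself works everything by hand), the eigenvalues of M^{\ T}M are the squared singular values:

```python
import numpy as np

M = np.array([[1.0, -1.0],
              [1.0, -1.0]])

# M^T M is symmetric, so eigvalsh applies; its eigenvalues are the
# squared singular values of M. eigvalsh returns them in ascending order.
eigvals = np.linalg.eigvalsh(M.T @ M)

# Clip tiny negative round-off before the square root, then sort descending.
singular_values = np.sqrt(np.clip(eigvals[::-1], 0.0, None))
print(singular_values)   # singular values: 2 and 0
```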

The columns of V are the eigenvectors of M^{\ T}M. So I end up with the following:

V = \left(\begin{array}{cc} -1 & 1\\ 1 & 1\\ \end{array}\right)

This V is symmetric, so its transpose is no different from V. To determine U, I find the eigenvectors of MM^{\ T}. I end up with the following matrix:

U = \left(\begin{array}{cc} 1 & -1\\ 1 & 1\\ \end{array}\right)

When you scale the columns of U and V so that the matrices become orthogonal, each column picks up a factor of 1/\sqrt{2}, so 0.5U\Sigma V^{\ T} (written with the unscaled matrices above) should yield the matrix M. It does not turn out to be M, though. Instead I end up with the following matrix:

0.5U\Sigma V^{\ T} = \left(\begin{array}{cc} -1 & 1\\ -1 & 1\\ \end{array}\right)

What am I doing wrong?!
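For what it's worth, a quick numpy check (not part of the attempt above) confirms the singular values and shows what one consistent choice of signs looks like; comparing numpy's U and V against the ones above makes the sign mismatch visible:

```python
import numpy as np

M = np.array([[1.0, -1.0],
              [1.0, -1.0]])

# numpy picks its own (mutually consistent) signs for the singular vectors.
U, s, Vt = np.linalg.svd(M)

print(s)                      # singular values: 2 and 0
print(U @ np.diag(s) @ Vt)    # reconstructs M
```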
 
Hi Dschumanji! :smile:

You must choose the bases of eigenvectors so that they have the same orientation. That is, you must choose U and V such that det(U) = det(V). This is not the case here. A solution is to set

U=\left(\begin{array}{cc} -1 & -1\\ -1 & 1\\ \end{array}\right)

This matrix, with the V you already had, will yield

M=0.5U\Sigma V^T
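A quick numpy check of this suggested U (a sketch; the factor 0.5 absorbs the missing 1/\sqrt{2} normalization of the columns of U and V):

```python
import numpy as np

M = np.array([[1.0, -1.0],
              [1.0, -1.0]])
Sigma = np.diag([2.0, 0.0])

U = np.array([[-1.0, -1.0],
              [-1.0,  1.0]])   # the sign choice suggested above
V = np.array([[-1.0,  1.0],
              [ 1.0,  1.0]])   # V from the original attempt

# 0.5 = (1/sqrt(2)) * (1/sqrt(2)), the missing column normalizations.
print(0.5 * U @ Sigma @ V.T)   # reproduces M
```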
 
Thanks, Micromass, for the help and also for providing an example of how to make a matrix with LaTeX! None of the textbooks and tutorials online mention that the relative orientation of the eigenvectors should be the same for both U and V. I have been trying this suggestion out on other SVD problems and it seems to work pretty well. However, there are some problems where the relative orientation of the vectors does not matter. For example, the really simple matrix:

M=\left(\begin{array}{cc} -3 & 0\\ 0 & 0\\ \end{array}\right)

Can have the following SVD:

M = \left(\begin{array}{cc} -1 & 0\\ 0 & 1\\ \end{array}\right)\left(\begin{array}{cc} 3 & 0\\ 0 & 0\\ \end{array}\right)\left(\begin{array}{cc} 1 & 0\\ 0 & 1\\ \end{array}\right)

The relative orientation of the eigenvectors in U is not the same as in V. The determinants of U and V are also not equal.
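This counterexample is easy to confirm numerically (a numpy sketch of the factorization just given):

```python
import numpy as np

M = np.array([[-3.0, 0.0],
              [ 0.0, 0.0]])

U = np.array([[-1.0, 0.0],
              [ 0.0, 1.0]])    # det(U) = -1
Sigma = np.diag([3.0, 0.0])
Vt = np.eye(2)                 # det(V) = +1

print(U @ Sigma @ Vt)          # reproduces M despite det(U) != det(V)
```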
 

OK, I spoke too quickly. You certainly need to take orientation into account, but you may not always want to take the same orientation.

Let's say that

M=U\Sigma V^T

Then

\det(M)=\det(U)\det(\Sigma)\det(V)

You know that det(U) and det(V) are each 1 or -1. So, if the signs of det(M) and det(\Sigma) are equal, then you need to choose U and V with the same orientation. If the signs are opposite, then you need to choose U and V with opposite orientations.

The question is somewhat trickier when confronted with non-square matrices, though. I agree that textbooks and websites should mention this...
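To see the sign rule in action on an invertible matrix (a hypothetical example, not from the thread, chosen because here det(\Sigma) > 0 so the rule actually bites):

```python
import numpy as np

# Hypothetical invertible example: det(M) = -2 < 0.
M = np.array([[1.0, 2.0],
              [3.0, 4.0]])
U, s, Vt = np.linalg.svd(M)

# det(Sigma) is the product of the singular values, hence >= 0,
# so the sign of det(M) must come from det(U) * det(V).
orientation = np.linalg.det(U) * np.linalg.det(Vt)   # det(V^T) = det(V)
print(round(orientation, 6))   # -1.0, matching the sign of det(M)
```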
 
Ah, that indeed makes sense. Thanks again!
 