
Singular Value Decomposition trouble

  1. Jun 23, 2005 #1
    Suppose I want to decompose [tex]A = \left(\begin{array}{cc}4&4\\-3&3\end{array}\right)[/tex].

    [tex]A = U \Sigma V^T \;\Rightarrow\; A^T A = V \Sigma^2 V^T[/tex] and [tex]A A^T = U \Sigma^2 U^T[/tex]

    So two independent eigenvectors of [tex]A^T A[/tex] form a basis for the row space of A, and two independent eigenvectors of [tex]A A^T[/tex] form a basis for the column space of A.

    If I pick
    [tex]v_{1} = \left(\begin{array}{c}1/\sqrt{2}\\1/\sqrt{2}\end{array}\right)[/tex] [tex]v_{2} = \left(\begin{array}{c}1/\sqrt{2}\\-1/\sqrt{2}\end{array}\right)[/tex]

    and

    [tex]u_{1} = \left(\begin{array}{c}1\\0\end{array}\right)[/tex] [tex]u_{2} = \left(\begin{array}{c}0\\1\end{array}\right)[/tex]

    as these bases, respectively, I get into trouble: [tex]U \Sigma V^T[/tex] will not equal A. There is a sign error, even though these vectors really are correct eigenvectors.

    For those who know him, this is an example Prof. Strang gave during one of his video lectures, but he couldn't quite see where the error came from. Since he never gave a solution, I hope I can find one here.

    I started thinking about this and noticed that these two bases don't have the same orientation. So I tried to find a new basis for one of the two spaces by flipping one vector to the opposite direction. The problem is, this only works half the time: of the four new bases I can form this way, only two work out the way they're supposed to.

    So if I change [tex]u_{2}[/tex] to [tex]\left(\begin{array}{c}0\\-1\end{array}\right)[/tex], everything works out. But if I change [tex]u_{1}[/tex] to [tex]\left(\begin{array}{c}-1\\0\end{array}\right)[/tex] instead, I get into trouble again.
    The same goes for [tex]v_{1}[/tex] and [tex]v_{2}[/tex].
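
    To make the trouble concrete, here is a quick numpy check (numpy isn't part of the lecture; this just encodes the vectors above):

    [code]
    import numpy as np

    A = np.array([[4.0, 4.0], [-3.0, 3.0]])

    s = 1 / np.sqrt(2)
    V = np.array([[s, s], [s, -s]])                  # columns v1, v2 (eigenvectors of A^T A)
    U = np.eye(2)                                    # columns u1, u2 (eigenvectors of A A^T)
    Sigma = np.diag([np.sqrt(32.0), np.sqrt(18.0)])  # singular values 4*sqrt(2), 3*sqrt(2)

    print(U @ Sigma @ V.T)                           # [[4, 4], [3, -3]] -- second row has the wrong sign
    [/code]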

    Can anyone tell me what is going on here?

    Phew, that was a lot of typing. :tongue2: I hope I didn't make any errors, but my main point is there and that's what counts.

    Thanks in advance.
     
  3. Jun 25, 2005 #2
    Hm, I guess this is not a popular subject then..
     
  4. Jun 26, 2005 #3

    mathwonk


    checking other people's errors is seldom entertaining work.

    i for one do not understand your language. what is a "singular value" decomposition? is a singular value what everyone else calls an eigenvalue?

    i'm too lazy to read your post until i understand it in my own language.
     
    Last edited: Jun 26, 2005
  5. Jun 27, 2005 #4
  6. Jun 27, 2005 #5

    mathwonk


    if you would try to answer some of my questions i would try to answer yours. :devil:
     
    Last edited: Jun 27, 2005
  7. Jun 28, 2005 #6
    I cannot explain it better than on the webpage I linked to.

    Singular values appear to be the non-negative square roots of the eigenvalues of [tex]A^T A[/tex] and [tex]A A^T[/tex], for a matrix A of any size. I think they play a role similar to the one eigenvalues play for square matrices.

    I think the SVD is a relatively new subject. It factors a matrix into two orthogonal square matrices and one m by n diagonal matrix (which can contain zeros on the diagonal).
    It takes an orthonormal basis of the row space to an orthogonal basis of the column space, with the entries of the diagonal matrix as the scaling factors.
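
    If you have numpy at hand, you can see the whole factorization at once (just an illustration; note that np.linalg.svd returns the diagonal as a vector):

    [code]
    import numpy as np

    A = np.array([[4.0, 4.0], [-3.0, 3.0]])
    U, s, Vt = np.linalg.svd(A)                  # A = U @ diag(s) @ Vt
    print(s)                                     # [5.6568..., 4.2426...] = [4*sqrt(2), 3*sqrt(2)]
    print(np.allclose(A, U @ np.diag(s) @ Vt))   # True -- the signs of U and Vt come out consistent
    [/code]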

    Hope this was helpful..
     
  8. Jun 28, 2005 #7
    TOM

    what do you study?

    marlon
     
  9. Jun 28, 2005 #8

    George Jones


    The decompositions [itex] A^T A = V \Sigma^2 V^T[/itex] and [itex]A A^T = U \Sigma^2 U^T[/itex] are not enough to determine the decomposition [itex]A = U \Sigma V^T [/itex]. As you have found, there are sign ambiguities. For example, if [itex] U [/itex] is replaced by [itex] U' = -U [/itex], then [itex] U' [/itex] still consists of normalized eigenvectors of [itex] A A^T [/itex], but [itex] A [/itex] goes to [itex]-A[/itex].

    According to the "Geometric meaning" section of the Wikipedia article, you need [itex]Av_i[/itex] to be a non-negative multiple of [itex]u_i[/itex]. This is true for your [itex]v_1[/itex] and [itex]u_1[/itex], but it is not true for your [itex]v_2[/itex] and [itex]u_2[/itex].
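
    Concretely, with the numbers from the first post:

    [tex]A v_1 = \left(\begin{array}{c}8/\sqrt{2}\\0\end{array}\right) = 4\sqrt{2}\, u_1, \qquad A v_2 = \left(\begin{array}{c}0\\-6/\sqrt{2}\end{array}\right) = 3\sqrt{2}\left(\begin{array}{c}0\\-1\end{array}\right)[/tex]

    So [itex]A v_2[/itex] is a non-negative multiple of [itex](0, -1)^T[/itex], not of the chosen [itex]u_2 = (0, 1)^T[/itex]; that is exactly the sign that goes wrong.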

    Regards,
    George
     
  10. Jun 28, 2005 #9

    mathwonk


    Tomme, the point is that even if the webpage explains it well to me, it would be better for you if you tried to understand it well enough to explain it yourself. then you would be on your way to answering your own question. :smile:
     
  11. Jun 28, 2005 #10

    mathwonk


    well you got my interest. actually i was not familiar with that theorem, and even found it almost unbelievable at first. for example it says geometrically that a linear isomorphism from R^2 to R^2, even if it is not length or angle preserving, nonetheless takes SOME orthogonal basis of R^2 to another orthogonal basis of R^2. NOT orthonormal!!


    for instance if the standard basis e1, e2 goes to e1, e1+e2, then the image vectors make an angle of 45 degrees. but if we rotate the basis {e1,e2} counterclockwise, then by the time they get to the position of {e2,-e1} their images make an angle of 135 degrees. so at some intermediate point their images were perpendicular.

    so the point is that if the map is "orthogonal", i.e. length and angle preserving, there is no problem. if on the other hand it does not preserve angles, then the images of different orthogonal bases make enough different angles with each other that you can arrange for those images to be orthogonal.

    of course you cannot arrange it so the images make angle zero with each other so probably what you can do is achieve an arc of angles symmetrical about the angle 90 degrees. just a guess.
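
    here is a small numerical version of that intermediate-value argument (a sketch, using the map e1 -> e1, e2 -> e1+e2 from above):

    [code]
    import numpy as np

    M = np.array([[1.0, 1.0], [0.0, 1.0]])       # e1 -> e1, e2 -> e1 + e2

    def g(t):
        v1 = np.array([np.cos(t), np.sin(t)])    # rotate the standard basis by t
        v2 = np.array([-np.sin(t), np.cos(t)])
        return (M @ v1) @ (M @ v2)               # dot product of the two images

    print(g(0.0), g(np.pi / 2))                  # 1.0 and -1.0: a sign change, so a root between
    lo, hi = 0.0, np.pi / 2
    for _ in range(60):                          # bisect for the angle with perpendicular images
        mid = (lo + hi) / 2
        lo, hi = (mid, hi) if g(mid) > 0 else (lo, mid)
    print(lo)                                    # ~1.0172 rad: an orthonormal pair maps to an orthogonal pair
    [/code]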


    does this help?
     
    Last edited: Jun 28, 2005
  12. Jun 28, 2005 #11
    Thanks George! That explains it! :rofl:
    I'm not sure if I would have seen this myself eventually. Like you guessed, I was stuck on the [itex]A A^T[/itex] and [itex]A^T A[/itex]..
    I am aiming to become a physicist, but at the moment I'm not sure how to get there. I tried a few times as a full-time student, but the stress got to me and I dropped out. That doesn't stop me from studying in the meantime, though. :smile:

    That's true. I think I did a good job of explaining it myself there. :biggrin: But I didn't make the connection.

    Thanks both.
     
  13. Jun 28, 2005 #12

    mathwonk


    it should be clear also from my post that if one finds an orthonormal basis u, w that is mapped to an orthogonal basis {f(u), f(w)}, then the orthonormal basis one wants to take in the target is f(u)/|f(u)| and f(w)/|f(w)|, and not minus these.
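
    in matrix terms, a minimal numpy sketch of this recipe (assuming the example matrix from the first post; the point is that U is built from A and V rather than chosen independently):

    [code]
    import numpy as np

    A = np.array([[4.0, 4.0], [-3.0, 3.0]])
    s = 1 / np.sqrt(2)
    V = np.array([[s, s], [s, -s]])              # any orthonormal eigenbasis of A^T A
    sigma = np.sqrt(np.array([32.0, 18.0]))      # singular values
    U = (A @ V) / sigma                          # u_i = A v_i / sigma_i (column-wise division)
    print(U)                                     # [[1, 0], [0, -1]] -- note the forced minus sign
    print(np.allclose(A, U @ np.diag(sigma) @ V.T))  # True
    [/code]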
     
  14. Jun 28, 2005 #13
    Sorry, I didn't see your second post there, mathwonk. I must say I'm not yet familiar with the term isomorphism; it's in my course text but I haven't covered it yet. So I read your post 3 times.. :shy:

    But I think I understand what you mean. I'm going to try this out tomorrow with a couple of vectors (a bit tired now), just for fun. Thanks for the input.
     
  15. Jun 29, 2005 #14

    mathwonk


    from a zen-like perspective, we know there is a spectral theorem diagonalizing symmetric matrices, and in fact for every matrix M, the product MM* is symmetric. so ask yourself: what does the diagonal form of MM* say about M itself? i.e. what do the eigenvectors of MM* mean for M?

    (I have not done this, but it would be my natural next question. i am sure that understanding this is key to understanding this SVD theorem, which does not seem deep, once one knows the spectral theorem.)
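
    one concrete step in that direction (a standard observation, added here for the record): M* exchanges the eigenvectors of MM* and M*M, since

    [tex]M M^{*} u = \lambda u \;\Rightarrow\; (M^{*} M)(M^{*} u) = M^{*}(M M^{*} u) = \lambda \,(M^{*} u)[/tex]

    so each eigenvector of MM* is carried by M* to an eigenvector of M*M with the same eigenvalue. that pairing is exactly what the SVD makes explicit.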
     
  16. Aug 29, 2011 #15
    "BEYOND SVD"



    "Advanced Eigenvalue Vector Decomposition"


    We have made a significant improvement to the Singular Value Decomposition methodology; this is actually an understatement.

    We have discovered that the current Singular Value Decomposition mathematical techniques and the resulting FORTRAN and C code are quite slow and inaccurate. What we have accomplished is to speed up computer execution by more than a factor of 1000 and to improve numerical accuracy by about 12 bits out of the 48 bits of the standard double mantissa. That is to say, there is no more than 1 bit of difference between the exact answer and my new solution method; previous and current methods can barely achieve 24-bit accuracy (out of 48). This improvement will make it possible to recognize red, green, blue, black, grey and white infrared images in real time as the first application.


    The range of applications for this new technology can go well
    beyond this.

    Of course, we expect skeptics about these claims, but we have demo
    programs that will read your data and produce results that you can compare, and we can even execute this new code on your computer if you desire.



    How AEVD Improves SVD Performance



    AEVD Technology, LLC offers a fully developed, advanced form of the Singular Value Decomposition (SVD) algorithm, which delivers a generational advance in speed and accuracy. Our Advanced Eigenvalue Vector Decomposition (or AEVD) was built upon a recognition of the shortcomings in how computers manipulate numbers, data and calculations, and reflects a painstaking analysis of the incremental steps in SVD processing to provide a faster process with fewer steps and improved accuracy.



    The SVD mathematical proposition is linearly independent, in an algebraic sense, of the similarity theorem, and as such provides a variety of available solution paths. One such path is to first reduce the input array to a real bidiagonal matrix with a sequence of intermediate left and right unitary transformations. This reduced form is usually chosen to have a real diagonal and one real superdiagonal; all of the remaining matrix elements are numerically considered as zero. It is possible to choose other reduced forms of the input matrix, but the use of a real bidiagonal array provides the most numerically accurate and computationally rapid solution. Additional numerical stability and computer accuracy are obtained by AEVD by choosing unitary transformations that place the larger bidiagonal elements in the upper left and the smaller elements in the lower right positions. This matters because the final determination of the left and right transformations and the SVD weights is always an iterative process for matrix sizes greater than four. Even for matrix sizes of four and less, iterative methods are usually simpler and require computational steps comparable to closed-form algebraic solutions. Also, when a real bidiagonal array format is chosen as the final iterate, the left and right transformations employed during iteration are orthogonal matrices. Other SVD iterative solution methods employ orthogonal variations such as the Jacobi rotation, Givens, and Householder reduction algorithms. Consistent among the more efficient and accurate SVD algorithms is the choice of the real
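
    For comparison, here is a bare-bones sketch of the classical Golub-Kahan reduction to bidiagonal form with Householder reflections (an illustration of the standard step described above, not AEVD's code; it assumes a real input with m >= n):

    [code]
    import numpy as np

    def householder(x):
        # reflection w such that (I - 2 w w^T) x = -sign(x[0]) * ||x|| * e1
        v = x.copy()
        v[0] += (1.0 if x[0] >= 0 else -1.0) * np.linalg.norm(x)
        n = np.linalg.norm(v)
        return v / n if n > 0 else v

    def bidiagonalize(A):
        # Golub-Kahan: returns U, B, V with A = U @ B @ V.T and B upper bidiagonal
        B = A.astype(float).copy()
        m, n = B.shape
        U, V = np.eye(m), np.eye(n)
        for k in range(n):
            w = householder(B[k:, k])            # zero out below the diagonal in column k
            H = np.eye(m - k) - 2 * np.outer(w, w)
            B[k:, :] = H @ B[k:, :]
            U[:, k:] = U[:, k:] @ H
            if k < n - 2:
                w = householder(B[k, k+1:])      # zero out right of the superdiagonal in row k
                G = np.eye(n - k - 1) - 2 * np.outer(w, w)
                B[:, k+1:] = B[:, k+1:] @ G
                V[:, k+1:] = V[:, k+1:] @ G
        return U, B, V

    rng = np.random.default_rng(0)
    A = rng.standard_normal((5, 4))
    U, B, V = bidiagonalize(A)
    print(np.round(B, 10))                       # only the diagonal and superdiagonal survive
    print(np.allclose(A, U @ B @ V.T))           # True
    [/code]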




    Regards,
    John Rawlins

    678-776-1343
     
  17. Aug 29, 2011 #16
    @OP:

    The eigenvectors of a matrix that is not Hermitian are not necessarily orthogonal. You need to use:
    [tex]
    A = U \, \Lambda \, U^{-1}
    [/tex]
    where [itex]U[/itex] is a matrix whose columns are the eigenvectors and [itex]\Lambda[/itex] is the diagonal matrix of eigenvalues.
     
  18. Aug 29, 2011 #17
    Oh, I see. The matrix [itex]A_{m \times n}[/itex] is not necessarily square. Ok, the matrices:
    [tex]
    B_{n \times n} = A^{\dagger}_{n \times m} \, A_{m \times n}
    [/tex]
    and
    [tex]
    C_{m \times m} = A_{m \times n} \, A^{\dagger}_{n \times m}
    [/tex]
    are Hermitian (ex: [itex]B^{\dagger} = (A^{\dagger} \, A)^{\dagger} = A^{\dagger} \, (A^{\dagger})^{\dagger} = A^{\dagger} \, A = B[/itex]). These can be diagonalized:
    [tex]
    B_{n \times n} \, U_{n \times n} = U_{n \times n} \, \Lambda_{n \times n}, \; U^{\dagger} \, U = U \, U^{\dagger} = 1_{n \times n}
    [/tex]
    [tex]
    C_{m \times m} \, V_{m \times m} = V_{m \times m} \, K_{m \times m}, \; V^{\dagger} \, V = V \, V^{\dagger} = 1_{m \times m}
    [/tex]

    Furthermore, the eigenvalues are non-negative:
    [tex]
    \lambda_{i} = \Lambda_{i i} = (U^{\dagger} \, B \, U)_{i i} = (U^{\dagger} \, A^{\dagger} \, A \, U)_{i i} = ((A \, U)^{\dagger} \, (A \, U))_{i i} = \sum_{k = 1}^{m}{|A \, U|^{2}_{k i}} \ge 0
    [/tex]
    The eigenvalue [itex]\lambda_{i}[/itex] is zero iff the i-th column of U is in the (right) null space of A. Similarly, the eigenvalue [itex]\kappa_{k}[/itex] is zero iff the k-th row of [itex]V^{\dagger}[/itex] is in the (left) null space of A.
     
    Last edited: Aug 29, 2011
  19. Aug 29, 2011 #18
    Next, take:
    [tex]
    A \, B \, U = A \, U \, \Lambda
    [/tex]
    [tex]
    (A \, A^{\dagger}) \, (A \, U) = (A \, U) \, \Lambda
    [/tex]
    [tex]
    C \, (A \, U) = (A \, U) \, \Lambda
    [/tex]
    [tex]
    V \, K \, (V^{\dagger} \, A \, U) = (A \, U) \, \Lambda
    [/tex]
    [tex]
    K \, (V^{\dagger} \, A \, U) = (V^{\dagger} \, A \, U) \, \Lambda
    [/tex]

    We didn't show this, but:
    [tex]
    \kappa_{k} = K_{k k} = (V^{\dagger} \, C \, V)_{k k} = (V^{\dagger} \, A \, A^{\dagger} \, V)_{k k} = ((V^{\dagger} \, A) (A^{\dagger} \, V))_{k k} = \sum_{i = 1}^{n}{|V^{\dagger} \, A|^{2}_{k i}}
    [/tex]
    so the nonzero eigenvalues of [itex]\Lambda[/itex] and [itex]K[/itex] coincide (the larger of the two sets just picks up extra zeros).

    This is why:
    [tex]
    V^{\dagger} \, A \, U = \Sigma_{m \times n}
    [/tex]
    where [itex]\Sigma_{m \times n}[/itex] is diagonal (as far as the main diagonal stretches), with the singular values [itex]\sigma_{i} = \sqrt{\lambda_{i}}[/itex] along it. The intertwining relation above forces [itex](V^{\dagger} \, A \, U)_{k i} = 0[/itex] whenever [itex]\kappa_{k} \neq \lambda_{i}[/itex], and choosing the columns of V as [itex]v_{i} = A \, u_{i} / \sqrt{\lambda_{i}}[/itex] (for [itex]\lambda_{i} > 0[/itex]) makes the surviving diagonal entries equal to [itex]\sqrt{\lambda_{i}}[/itex].
    Then, we have:
    [tex]
    A = V_{m \times m} \, \Sigma_{m \times n} \, U^{\dagger}_{n \times n}
    [/tex]
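
    A quick numerical sanity check of this construction (a sketch; note that numpy's eigh sorts eigenvalues in increasing order, hence the column reversal):

    [code]
    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((3, 2))              # m = 3, n = 2

    lam, U = np.linalg.eigh(A.T @ A)             # B = A^T A, n x n
    kap, V = np.linalg.eigh(A @ A.T)             # C = A A^T, m x m
    U, V = U[:, ::-1], V[:, ::-1]                # decreasing eigenvalue order

    S = V.T @ A @ U                              # m x n, zero away from the main diagonal
    print(np.round(S, 10))
    print(np.sqrt(lam[::-1]))                    # |diagonal of S| = sqrt(lambda_i), up to sign
    [/code]

    The leftover signs on the diagonal of S are the same ambiguity discussed earlier in the thread; pinning [itex]v_{i} = A \, u_{i} / \sqrt{\lambda_{i}}[/itex] removes them.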
     
  20. Aug 30, 2011 #19
    To correct the deduction for the diagonal elements of [itex]\Sigma[/itex]: they are the singular values [itex]\sqrt{\lambda_{i}}[/itex], not 1's.
     