
SVD and orthonormal bases

  1. Apr 4, 2015 #1
    Problem:

    Suppose ##u_1,\dots,u_n## and ##v_1,\dots,v_n## are orthonormal bases for ##\mathbb{R}^n##. Construct the matrix ##A## that transforms each ##v_j## into ##u_j##, so that ##Av_1=u_1,\dots,Av_n=u_n##.

    The answer key says ##A=UV^T## since all ##\sigma_j=1##. Why are all the ##\sigma_j## equal to 1?
     
  3. Apr 5, 2015 #2

    Fredrik

    Staff Emeritus
    Science Advisor
    Gold Member

    What does the fact that the two bases are orthonormal tell you about A? Can you use the answer to that somehow?
     
    Last edited: Apr 5, 2015
  4. Apr 5, 2015 #3
    I regrouped and thought about this...

    If ##A=U\Sigma V^T##, then ##AV=(U\Sigma V^T)V = U\Sigma(V^TV) = U\Sigma##, because the columns of ##V## are orthonormal, so ##V^TV=I##.

    If ##\Sigma=I##, then ##AV=U## and every ##\sigma_j=1##.

    I think that is it. Thoughts? Much thanks.
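
    For what it's worth, here is a small NumPy sketch of that identity (the random orthogonal bases, the chosen ##\Sigma##, and the variable names are only illustrative, not part of the problem): it builds orthogonal ##U## and ##V##, an arbitrary diagonal ##\Sigma##, forms ##A=U\Sigma V^T##, and checks that ##AV=U\Sigma##.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Two orthonormal bases as the columns of orthogonal matrices U and V (via QR).
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)))

Sigma = np.diag([3.0, 2.0, 1.0, 0.5])  # arbitrary singular values, just to test the identity

A = U @ Sigma @ V.T

print(np.allclose(A @ V, U @ Sigma))    # True: AV = U Sigma, since V^T V = I
print(np.allclose(V.T @ V, np.eye(n)))  # True: columns of V are orthonormal
```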
     
  5. Apr 6, 2015 #4

    Fredrik

    Staff Emeritus
    Science Advisor
    Gold Member

    This is correct, but your argument looks kind of circular. It looks like you had been told that ##\Sigma=I## because all the ##\sigma_i## are 1, and wanted to know why all the ##\sigma_i## are 1. Now you're saying that they're all 1 because ##\Sigma=I##. You can use ##\sigma_i=1## to explain ##\Sigma=I##, or you can use ##\Sigma=I## to explain ##\sigma_i=1##, but you can't do both.

    You're using a theorem that tells you that ##A=U\Sigma V^T##, and also defines ##U##, ##\Sigma## and ##V##, right? How does that theorem define ##\Sigma##? If it defines it as a diagonal matrix with the singular values of ##A## (=eigenvalues of ##\sqrt{A^*A}##) on the diagonal, then I would recommend that you ignore the theorem until you have figured out what the problem statement is telling you about A (and about ##A^*A##).

    By the way, the appropriate place for a question about a textbook-style problem about linear algebra is the calculus & beyond homework forum.
     
    Last edited: Apr 6, 2015
  6. Apr 6, 2015 #5
    Sorry about the misplaced post. I wasn't working with the idea that ##\Sigma=I## and thus ##\sigma_j=1##, or vice versa. I was working with the idea that ##AV## has to equal ##U##, and the only way that can happen is if ##\Sigma=I##. Even so, I accept your challenge to dig deeper into the problem, and so far I have come up with the following:

    ##Av_1=u_1## means that ##u_1## is a linear combination of the columns of ##A##,

    but ##u_1## is part of a basis, so by definition it cannot be written in terms of the other basis vectors. Therefore, ##A## is a projection matrix. (Not sure about this, gotta think about it a little more... just thought I would include my thoughts.)
     
  7. Apr 7, 2015 #6

    Fredrik

    Staff Emeritus
    Science Advisor
    Gold Member

    It does, but I don't think this will help you find the property of A that I had in mind.

    This is not correct. In fact, the only projection that takes an orthonormal basis to another is the identity map.

    I'll give you another (small) hint: Start by writing down the formula that says that ##\{u_i\}## is orthonormal.
     
  8. Apr 9, 2015 #7
    Okay, here is another kick at the can. I want a matrix ##A## that takes ##V## to ##U##, i.e. ##AV=U##, where ##U## and ##V## are the matrices whose columns are the orthonormal vectors ##u_1,\dots,u_n## and ##v_1,\dots,v_n## respectively. I can isolate ##A## by multiplying on the right by the inverse of ##V##, so ##A=UV^{-1}##, and because ##V## is orthogonal (its columns are orthonormal), ##V^{-1}=V^T##, so ##A=UV^T##. So if ##U## and ##V## are rotation matrices, then ##A## is the composition of those rotations. I checked it with some 2x2 matrices and it seemed to work (a quick sketch of that check is below). Thoughts?
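
    A minimal NumPy version of that 2x2 check (the specific rotation angles are just an example I picked, not from the book):

```python
import numpy as np

def rotation(theta):
    """Rotation matrix; its columns form an orthonormal basis of R^2."""
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

U = rotation(0.7)    # columns u_1, u_2
V = rotation(-1.2)   # columns v_1, v_2

A = U @ V.T          # candidate matrix, since V^{-1} = V^T for an orthogonal V

print(np.allclose(A @ V, U))              # True: AV = U
print(np.allclose(A @ V[:, 0], U[:, 0]))  # True: A v_1 = u_1
print(np.linalg.svd(A, compute_uv=False)) # singular values are [1. 1.]
```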
     
  9. Apr 10, 2015 #8

    Fredrik

    Staff Emeritus
    Science Advisor
    Gold Member

    This is correct. (Not exactly the method I had in mind, but close enough. We can discuss my method later). Now what does ##A=UV^T## tell you about the singular values of ##A##? What is your book's definition of a singular value?
     
  10. Apr 12, 2015 #9
    Well, my book and other readings from the internet seem to suggest that a singular value is similar to an eigenvalue. However, a singular value is the scalar ##\sigma_j## satisfying ##Av_j=\sigma_j u_j##, i.e. the factor by which ##A## scales the orthonormal basis vector ##v_j## when mapping it onto the direction ##u_j## of the other orthonormal basis. ##A=UV^T## seems to suggest the singular values are all 1. Question: Are the singular values always 1 in an SVD? It seems to be the case because the columns of ##U## and ##V## are always orthonormal.
     
  11. Apr 13, 2015 #10

    Fredrik

    Staff Emeritus
    Science Advisor
    Gold Member

    According to "Linear algebra done wrong" by Sergei Treil (which can be downloaded for free online), the singular values of ##A## are the eigenvalues of the operator ##|A|##, defined by ##|A|=\sqrt{A^*A}##, where ##A^*## is the adjoint of ##A##. Knowing this makes it very easy to find the singular values in your problem (where ##A## takes one orthonormal basis to another).

    If this isn't how your book defines them, then it must have provided some other definition, or at least a way to calculate them.
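
    With that definition the computation for this problem is short (a sketch, using only that ##U## and ##V## are orthogonal, so ##U^TU=V^TV=VV^T=I##):

    $$A^*A=(UV^T)^T(UV^T)=VU^TUV^T=VV^T=I,\qquad |A|=\sqrt{A^*A}=\sqrt{I}=I,$$

    so every eigenvalue of ##|A|##, i.e. every singular value of ##A##, equals 1, and ##\Sigma=I## in ##A=U\Sigma V^T=UV^T##.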
     
    Last edited: Apr 13, 2015