
Linear Systems with pseudoinverse matrix using SVD

  1. Aug 27, 2013 #1
    Hi!

    I have a question concerning solving a system of linear equations. I know that the pseudoinverse matrix by using SVD is useful for this, but haven't gotten the pieces together yet.

    Let's assume I have this system of linear equations, where in each equation a 3-component vector (e.g. V1) is transformed by a matrix (M) to match a different 3-component vector (V2):

    V1 x M = V2
    V3 x M = V4
    V5 x M = V6

    where each V has three components x,y,z. How do I solve for M?
    I know that SVDs come handy here, but I have not used them before, so I'd be curious for any help.

    Thanks,


    Nhat
     
  3. Sep 2, 2013 #2
    solving for a matrix

    Since you have a 3x3 matrix M it has nine entries which must be determined. Each of your 3 equations generates a set of 3 linear equations in 9 unknowns. So you end up with 9 linear equations in 9 unknowns. Call this system Ax = b.

    If A is invertible your system has a unique solution; if A is not invertible your system will either yield a vector space of solutions or no solution. Ordinarily it is a pain to figure out whether a 9x9 matrix is invertible, but in this case A is a special 9x9 matrix, because it really breaks down into 3 sets of equations in 3 unknowns.

    For example, suppose (a,b,c) is the first column of M and (1,2,3) is the first vector V1. Then their dot product is a + 2b + 3c, which is the first component of V2; the other elements of M are not involved in that equation. The same a, b and c show up again in the first equation for the second vector pair and in the first equation for the third, so those three equations involve only those three unknowns.

    So now you are simply trying to figure out whether each of the 3x3 systems you get is solvable; note that all three share the same 3x3 coefficient matrix, whose rows are V1, V3 and V5. You can triangularize it and see -- if it is invertible this method will also solve for the various components of M -- or show there are no solutions, or produce the vector space of solutions.
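    Because the three equations share the same left-hand vectors, they stack into one matrix equation A M = B, with the rows of A being V1, V3, V5 and the rows of B being V2, V4, V6. A minimal NumPy sketch of this (the vector values here are made up purely for illustration):

    ```python
    import numpy as np

    # Stack the input vectors as rows: each row of A is one V on the left-hand side.
    A = np.array([[1.0, 2.0, 3.0],   # V1 (example values)
                  [0.0, 1.0, 4.0],   # V3
                  [5.0, 6.0, 0.0]])  # V5

    # Stack the target vectors the same way: row i of B is what row i of A maps to.
    B = np.array([[1.0, 0.0, 0.0],   # V2 (example values)
                  [0.0, 1.0, 0.0],   # V4
                  [0.0, 0.0, 1.0]])  # V6

    # V1 M = V2, V3 M = V4, V5 M = V6 combine into the single equation A M = B.
    # np.linalg.solve handles all three columns of unknowns at once.
    M = np.linalg.solve(A, B)

    # Check: each row of A, multiplied by M, reproduces the matching row of B.
    assert np.allclose(A @ M, B)
    ```

    This only works when A is invertible (it is for the example values above); the singular case is exactly where the pseudoinverse discussion below comes in.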

    Now pseudo inverses: If A is invertible you don't need them, because you have a real inverse and a unique solution for every Ax = b. If A is not invertible then you either have no solutions or a vector space of solutions. In your 3x3 case you can find that vector space with direct computation, and that would be the easiest thing.
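    For the singular case, a least-squares routine finds the minimum-norm member of that solution space without forming the pseudoinverse explicitly. A hedged sketch, again with made-up values chosen so that A is rank-deficient but the system is still consistent:

    ```python
    import numpy as np

    # A singular coefficient matrix: the third row is the sum of the first two,
    # so the rows are linearly dependent and no true inverse exists.
    A = np.array([[1.0, 0.0, 1.0],
                  [0.0, 1.0, 1.0],
                  [1.0, 1.0, 2.0]])

    # A consistent right-hand side (its third entry is the sum of the first two).
    b = np.array([1.0, 2.0, 3.0])

    # lstsq returns the minimum-norm x among all solutions of A x = b
    # (or the best least-squares fit if the system were inconsistent).
    x, residuals, rank, sing_vals = np.linalg.lstsq(A, b, rcond=None)

    assert rank == 2              # A is rank-deficient
    assert np.allclose(A @ x, b)  # the system is still solved exactly
    ```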

    The pseudo inverse A[itex]^{+}[/itex] provides a formula for computing that vector space, if you can compute A[itex]^{+}[/itex]. The SVD method is the usual way to compute it. In this method you find matrices U and V such that A = UDV[itex]^{*}[/itex], where D is a diagonal matrix with non-negative real entries (the singular values), U and V are unitary matrices, and V[itex]^{*}[/itex] is the Hermitian transpose of V. Then A[itex]^{+}[/itex] = VD[itex]^{+}[/itex]U[itex]^{*}[/itex], where D[itex]^{+}[/itex] is obtained by reciprocating the nonzero singular values and leaving the zeros in place. The standard linear algebra texts provide methods of computing U, D and V, which work okay if the order of the matrix is not large.
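    The SVD recipe above can be sketched directly in NumPy: factor A = UDV[itex]^{*}[/itex], reciprocate the nonzero singular values, and reassemble. The test matrix is arbitrary (the same singular example as before), and the tolerance choice is one common convention, not the only one:

    ```python
    import numpy as np

    A = np.array([[1.0, 0.0, 1.0],
                  [0.0, 1.0, 1.0],
                  [1.0, 1.0, 2.0]])  # singular, so no true inverse exists

    # SVD: A = U @ diag(s) @ Vt, with s the non-negative singular values.
    U, s, Vt = np.linalg.svd(A)

    # Build D+: reciprocate singular values above a small tolerance, zero the rest.
    tol = max(A.shape) * np.finfo(float).eps * s.max()
    s_pinv = np.where(s > tol, 1.0 / s, 0.0)

    # A+ = V D+ U* (real matrices here, so the Hermitian transpose is just .T).
    A_pinv = Vt.T @ np.diag(s_pinv) @ U.T

    # Sanity checks: agrees with NumPy's built-in pseudoinverse, and
    # satisfies the defining Moore-Penrose identity A A+ A = A.
    assert np.allclose(A_pinv, np.linalg.pinv(A))
    assert np.allclose(A @ A_pinv @ A, A)
    ```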
     
    Last edited: Sep 2, 2013