Linear Systems with pseudoinverse matrix using SVD

SUMMARY

The discussion focuses on solving a system of linear equations using the pseudoinverse matrix derived from Singular Value Decomposition (SVD). The equations involve a 3x3 matrix (M) that transforms the 3-component vectors V1, V3, V5 into target vectors V2, V4, V6. It is established that if the matrix A formed from the equations is invertible, a unique solution exists; otherwise, the system yields either no solution or a vector space of solutions. The SVD method is recommended for computing the pseudoinverse: A can be expressed as A = UDV* and then A+ = VD+U*.

PREREQUISITES
  • Understanding of linear algebra concepts, specifically linear equations and matrices.
  • Familiarity with Singular Value Decomposition (SVD) and its applications.
  • Knowledge of pseudoinverse matrices and their significance in solving linear systems.
  • Basic proficiency in matrix operations, including dot products and triangularization.
NEXT STEPS
  • Study the process of computing the pseudoinverse using SVD in detail.
  • Learn about the properties and applications of unitary matrices in linear algebra.
  • Explore methods for determining the invertibility of matrices, particularly 9x9 matrices.
  • Investigate standard linear algebra texts that provide algorithms for computing U, D, and V in SVD.
USEFUL FOR

Mathematicians, data scientists, and engineers involved in solving linear systems, particularly those utilizing SVD for matrix computations and pseudoinverses.

Phong
Hi!

I have a question concerning solving a system of linear equations. I know that the pseudoinverse matrix computed via SVD is useful for this, but I haven't put the pieces together yet.

Let's assume I have this system of linear equations, where each equation transforms one 3-component vector (e.g. V1) by a matrix (M) to match another 3-component vector (V2):

V1 x M = V2
V3 x M = V4
V5 x M = V6

where each V has three components x, y, z. How do I solve for M?
I know that SVDs come in handy here, but I have not used them before, so I'd be grateful for any help.

Thanks,


Nhat
 
solving for a matrix

Since you have a 3x3 matrix M it has nine entries which must be determined. Each of your 3 equations generates a set of 3 linear equations in 9 unknowns. So you end up with 9 linear equations in 9 unknowns. Call this system Ax = b.
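As a concrete sketch (using NumPy, which the thread doesn't mention, and made-up example vectors), the three vector equations can be stacked into a single matrix equation and solved directly when the stacked matrix is invertible:

```python
import numpy as np

# Hypothetical data (not from the thread): each row is one input/output vector.
A = np.vstack([[1., 2., 3.],    # V1
               [0., 1., 1.],    # V3
               [2., 0., 1.]])   # V5
B = np.vstack([[1., 0., 0.],    # V2
               [0., 1., 0.],    # V4
               [0., 0., 1.]])   # V6

# Stacking turns the three vector equations Vi M = Wi into the single
# matrix equation A M = B; if A is invertible, M = A^{-1} B.
M = np.linalg.solve(A, B)
```

This is equivalent to solving the 9x9 system described above, but lets the library handle the bookkeeping.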

If A is invertible your system has a unique solution; if A is not invertible your system will yield either a vector space of solutions or no solution. Ordinarily it is a pain to figure out whether a 9x9 matrix is invertible, but in this case A is a special 9x9, because it really breaks down into 3 sets of 3 equations in 3 unknowns.

For example, suppose (a,b,c) is the first column of M and (1,2,3) is the first vector V1. Since V1 multiplies M from the left, the first component of V2 is the dot product of V1 with that column: a + 2b + 3c; the other entries of M are not involved. The same a, b and c show up again in the first-component equation for the second pair of vectors and in the first-component equation for the third.
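The block structure of the 9x9 system can be made explicit with a Kronecker product. A sketch (NumPy, with the same assumed example data; flattening M column by column, the coefficient matrix is block diagonal with one 3x3 block per column of M):

```python
import numpy as np

A = np.array([[1., 2., 3.], [0., 1., 1.], [2., 0., 1.]])  # rows V1, V3, V5 (assumed data)
B = np.eye(3)                                             # rows V2, V4, V6 (assumed data)

# vec(A M) = (I kron A) vec(M) with column-wise flattening, so the 9x9
# system is block diagonal: three copies of A on the diagonal.
A9 = np.kron(np.eye(3), A)       # 9x9 coefficient matrix
b9 = B.flatten(order='F')        # right-hand side, columns stacked

x9 = np.linalg.solve(A9, b9)
M = x9.reshape(3, 3, order='F')  # unflatten back into the 3x3 matrix M
```

Note that all three diagonal blocks are the same matrix, which is why the problem reduces to one 3x3 factorization.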

So now you are simply trying to figure out whether each of the resulting 3x3 coefficient matrices is invertible -- in fact all three are the same matrix, whose rows are V1, V3 and V5. You can triangularize it and see: if it is invertible, the same elimination also solves for the various components of M; otherwise it shows there are no solutions, or produces the vector space of solutions.
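The triangularize-and-back-substitute step can be sketched by hand (plain Gaussian elimination with partial pivoting, on the same assumed example data; in practice a library routine would do this):

```python
import numpy as np

A = np.array([[1., 2., 3.], [0., 1., 1.], [2., 0., 1.]])  # rows V1, V3, V5 (assumed data)
B = np.eye(3)                                             # rows V2, V4, V6 (assumed data)
aug = np.hstack([A, B])          # augmented matrix [A | B]

n = 3
for k in range(n):               # forward elimination (triangularization)
    p = k + np.argmax(np.abs(aug[k:, k]))   # partial pivoting for stability
    aug[[k, p]] = aug[[p, k]]
    for i in range(k + 1, n):
        aug[i] -= (aug[i, k] / aug[k, k]) * aug[k]

M = np.zeros((n, n))
for k in range(n - 1, -1, -1):   # back substitution, one row of M at a time
    M[k] = (aug[k, n:] - aug[k, k + 1:n] @ M[k + 1:]) / aug[k, k]
```

If a pivot aug[k, k] comes out (numerically) zero, the matrix is singular and this is where you'd branch into the no-solution / solution-space cases instead.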

Now pseudoinverses: if A is invertible you don't need them, because you have a true inverse and a unique solution for every Ax = b. If A is not invertible then you have either no solution or a vector space of solutions. In your 3x3 case you can find that vector space by direct computation, and that would be the easiest thing.
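When A is singular, the pseudoinverse gives the minimum-norm least-squares solution, which is a natural particular solution to anchor the solution space. A sketch (NumPy, with a deliberately singular assumed example):

```python
import numpy as np

# Singular example (assumed data): the third row is the sum of the first
# two, so A is not invertible and A x = b need not have an exact solution.
A = np.array([[1., 0., 1.], [0., 1., 1.], [1., 1., 2.]])
b = np.array([1., 2., 0.])

x = np.linalg.pinv(A) @ b   # minimum-norm least-squares solution

# lstsq computes the same particular solution; the full solution set
# (when solutions exist) is x plus anything in the null space of A.
x2, residuals, rank, sv = np.linalg.lstsq(A, b, rcond=None)
```

Here `rank` comes back as 2, confirming the singularity.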

The pseudoinverse A[itex]^{+}[/itex] provides a formula for computing that solution set, if you can compute A[itex]^{+}[/itex]. The SVD method is the usual way to compute it. In this method you find matrices U and V such that A = UDV[itex]^{*}[/itex], where D is a diagonal matrix with non-negative real entries, U and V are unitary matrices, and V[itex]^{*}[/itex] is the Hermitian transpose of V. Then A[itex]^{+}[/itex] = VD[itex]^{+}[/itex]U[itex]^{*}[/itex], where D[itex]^{+}[/itex] inverts the nonzero diagonal entries and leaves the zeros in place. The standard linear algebra texts provide methods of computing U, D and V, which work fine as long as the order of the matrix is not large.
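The A+ = VD+U* recipe above can be carried out directly with a library SVD. A sketch (NumPy, real matrices so the Hermitian transpose is just the transpose; the singular-value cutoff `tol` is a common choice, not something the thread specifies):

```python
import numpy as np

A = np.array([[1., 0., 1.], [0., 1., 1.], [1., 1., 2.]])  # singular example (assumed data)

# SVD: A = U @ diag(s) @ Vh, with U and Vh orthogonal (unitary and real).
U, s, Vh = np.linalg.svd(A)

# D+ inverts the nonzero singular values and leaves the (numerical) zeros
# alone; tol separates genuine singular values from round-off.
tol = max(A.shape) * np.finfo(float).eps * s.max()
s_plus = np.zeros_like(s)
s_plus[s > tol] = 1.0 / s[s > tol]

A_plus = Vh.T @ np.diag(s_plus) @ U.T   # A+ = V D+ U*
```

This should agree with NumPy's built-in `np.linalg.pinv`, which uses the same construction.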
 
