Linear systems with the pseudoinverse matrix using SVD

In summary, the system of linear equations described in the conversation can be solved using the pseudoinverse matrix computed via SVD. The problem amounts to finding a 3x3 matrix M with nine unknowns from nine linear equations. If the coefficient matrix A is invertible, there is a unique solution; if not, there is either no solution or a vector space of solutions. SVD is used to compute the pseudoinverse A^{+}, which provides a formula for that solution set. Standard linear algebra texts offer methods for computing U, D, and V, though these may not be efficient for large matrices.
  • #1
Phong
Hi!

I have a question concerning solving a system of linear equations. I know that the pseudoinverse matrix computed via SVD is useful for this, but I haven't put the pieces together yet.

Let's assume I have this system of linear equations, where in each equation a 3-component vector (V1) must be transformed by a matrix (M) to match another 3-component vector (V2):

V1 x M = V2
V3 x M = V4
V5 x M = V6

where each V has three components x, y, z. How do I solve for M?
I know that SVDs come in handy here, but I have not used them before, so I'd appreciate any help.

Thanks,


Nhat
 
  • #2
solving for a matrix

Since you have a 3x3 matrix M, it has nine entries which must be determined. Each of your 3 vector equations generates a set of 3 linear equations in the 9 unknowns. So you end up with 9 linear equations in 9 unknowns. Call this system Ax = b.
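A minimal NumPy sketch of how that 9x9 system could be assembled, assuming real vectors and a row-major stacking of M's entries (the V values below are made up for illustration):

```python
import numpy as np

# Placeholder input/output vectors (illustrative values only).
V1, V3, V5 = np.array([1., 2., 3.]), np.array([4., 5., 6.]), np.array([7., 8., 10.])
V2, V4, V6 = np.array([1., 0., 0.]), np.array([0., 1., 0.]), np.array([0., 0., 1.])

# Unknown x stacks the 9 entries of M row by row: x[3*i + j] = M[i, j].
# Each vector equation Vin @ M = Vout contributes three rows to A.
A = np.zeros((9, 9))
b = np.zeros(9)
for k, (vin, vout) in enumerate([(V1, V2), (V3, V4), (V5, V6)]):
    for j in range(3):              # j-th component of the output vector
        for i in range(3):
            # (vin @ M)[j] = sum_i vin[i] * M[i, j]
            A[3*k + j, 3*i + j] = vin[i]
        b[3*k + j] = vout[j]

x = np.linalg.solve(A, b)           # succeeds when A is invertible
M = x.reshape(3, 3)
```

Note the block structure of A: each column of M couples only to its own three equations, which is the decoupling described next.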

If A is invertible your system has a unique solution; if A is not invertible your system will either yield a vector space of solutions or no solution. Ordinarily it is a pain to figure out whether a 9x9 matrix is invertible, but in this case A is a special 9x9 matrix, because it really breaks down into 3 sets of equations in 3 unknowns.

For example, suppose (a,b,c) is the first column of M and (1,2,3) is the first vector V1. Then their dot product is a + 2b + 3c, which is the first component of V2; the other entries of M are not involved. The same a, b and c show up again in the first equation for the second vector and the first equation for the third vector. So each column of M is determined by its own 3x3 system, and all three systems share the same coefficient matrix, whose rows are V1, V3 and V5.

So now you are simply trying to figure out whether that 3x3 coefficient matrix is invertible. You can triangularize it and see -- if it is invertible, this method will also solve for the various components of M -- or show there are no solutions, or produce the vector space of solutions.
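Because all three columns of M share the same coefficient matrix, the whole problem collapses to one 3x3 matrix equation. A sketch with hypothetical values, assuming NumPy:

```python
import numpy as np

# Stack the input vectors as rows of A and the target vectors as rows of B,
# so the three equations Vi @ M = Vj become the single equation A @ M = B.
A = np.array([[1., 2., 3.],     # V1 (illustrative values)
              [4., 5., 6.],     # V3
              [7., 8., 10.]])   # V5
B = np.eye(3)                   # rows are V2, V4, V6

M = np.linalg.solve(A, B)       # succeeds exactly when A is invertible
```

`np.linalg.solve` effectively performs the triangularization (LU factorization) described above, solving all three columns at once.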

Now pseudo inverses: If A is invertible you don't need them, because you have a real inverse and a unique solution for every Ax = b. If A is not invertible then you either have no solutions or a vector space of solutions. In your 3x3 case you can find that vector space with direct computation, and that would be the easiest thing.

The pseudo inverse A[itex]^{+}[/itex] provides a formula for computing that vector space, if you can compute A[itex]^{+}[/itex]. The SVD method is the usual way to compute it. In this method you find matrices U and V such that A = UDV[itex]^{*}[/itex], where D is a diagonal matrix with non-negative real entries, U and V are unitary matrices and V[itex]^{*}[/itex] is the Hermitian transpose of V. Then A[itex]^{+}[/itex] = VD[itex]^{+}[/itex]U[itex]^{*}[/itex]. The standard linear algebra texts provide methods for computing U, D and V, which work well if the order of the matrix is not large.
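A sketch of that construction in NumPy, using a deliberately singular example matrix (real case, so the Hermitian transpose is just the transpose); the tolerance cutoff is one common choice, not the only one:

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [2., 4., 6.],    # = 2 * first row, so A is singular
              [1., 0., 1.]])

# SVD: A = U @ diag(s) @ Vh, with s >= 0 and U, Vh orthogonal (real case).
U, s, Vh = np.linalg.svd(A)

# D^+ inverts only the nonzero singular values; tiny ones are treated as zero.
tol = max(A.shape) * np.finfo(float).eps * s.max()
s_plus = np.array([1.0 / sv if sv > tol else 0.0 for sv in s])

# A^+ = V D^+ U^*
A_plus = Vh.T @ np.diag(s_plus) @ U.T
```

The result satisfies the defining Moore-Penrose identities, e.g. A A⁺ A = A, and matches NumPy's built-in `np.linalg.pinv`.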
 

1. What is the pseudoinverse matrix?

The pseudoinverse matrix is a generalization of the inverse matrix for non-square matrices. It is used to find the "best" solution to a system of linear equations that does not have an exact solution.
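A small NumPy illustration of this "generalization": for an invertible square matrix the pseudoinverse coincides with the ordinary inverse, while for a non-square matrix it still exists and yields the least-squares "best" solution (values below are made up):

```python
import numpy as np

# Invertible square matrix: pseudoinverse equals the ordinary inverse.
A = np.array([[2., 1.],
              [1., 3.]])
assert np.allclose(np.linalg.pinv(A), np.linalg.inv(A))

# Non-square matrix: 3 equations, 2 unknowns, slightly inconsistent data.
B = np.array([[1., 0.],
              [0., 1.],
              [1., 1.]])
b = np.array([1., 2., 2.9])

x = np.linalg.pinv(B) @ b      # minimizes ||B @ x - b||
```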

2. How is the pseudoinverse matrix calculated?

The pseudoinverse matrix of a matrix A is calculated using the singular value decomposition (SVD) method. The SVD can be related to the eigenvalues and eigenvectors of A*A and AA*, where * denotes the conjugate transpose: the singular values of A are the square roots of the eigenvalues of A*A. The pseudoinverse matrix is then constructed from these factors by inverting the nonzero singular values.
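The connection between singular values and the eigenvalues of A*A can be checked numerically; a quick sketch with a random real matrix (so the conjugate transpose is just the transpose):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))

# Singular values of A are the square roots of the eigenvalues of A^T A.
_, s, _ = np.linalg.svd(A)
eigvals = np.linalg.eigvalsh(A.T @ A)   # returned in ascending order

assert np.allclose(np.sort(s**2), eigvals)
```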

3. What is the application of pseudoinverse matrices?

Pseudoinverse matrices are used in a variety of applications, including data analysis, image processing, and control systems. They can also be used to solve overdetermined systems of linear equations, where there are more equations than unknowns.

4. How is the pseudoinverse matrix used in linear systems?

In linear systems, the pseudoinverse matrix is used to find the least-squares solution to a system of equations. This means that it minimizes the sum of the squared differences between the actual values and the predicted values. It can also be used to find a solution to an underdetermined system of equations, where there are more unknowns than equations.
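For the underdetermined case, the pseudoinverse picks out the minimum-norm solution among the infinitely many exact solutions. A hedged sketch with illustrative values:

```python
import numpy as np

# Underdetermined system: 2 equations, 3 unknowns -> infinitely many solutions.
A = np.array([[1., 1., 0.],
              [0., 1., 1.]])
b = np.array([1., 2.])

# Among all x with A @ x = b, the pseudoinverse gives the one of smallest norm.
x = np.linalg.pinv(A) @ b

assert np.allclose(A @ x, b)   # it really is an exact solution
```

Adding any null-space vector of A to x gives another exact solution, but one with a strictly larger norm.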

5. What are the advantages of using the pseudoinverse matrix?

The pseudoinverse matrix allows for the solution of systems of equations that do not have an exact solution. It also provides a way to find the "best" solution to a system of equations, even if there are errors or noise in the data. Additionally, the SVD-based computation is numerically stable, even for ill-conditioned or rank-deficient matrices.
