Finding the Inverse of a Rank-Deficient Matrix

In summary, the problem is that A is rank deficient and I do not know how to find the inverse of A. I am trying to find the least-squares phase (Phi) fit to a given set of slopes.
  • #1
Shaddyab
I have the following problem:

A * Phi = Ax' * Sx + Ay' * Sy

where,
A= Ax' * Ax + Ay' * Ay + Axy' * Axy

and I would like to solve for Phi.

Matrix A is:
1) symmetric
2) 89x89
3) Rank(A) = 88 (I guess this means there is no unique solution)
4) Det(A) ~= 0 (I guess this means A is not singular)
5) sparse (673 non-zero elements out of 7921, i.e. 8.5%)

How can I find the inverse of A and solve for Phi ?

Thank you
 
  • #2
You don't. A "rank deficient" square matrix, which necessarily has determinant 0, does NOT have an inverse!

Now, what are "Ax' ", "Sx", etc.?
 
  • #3
Det(A) is NOT equal to zero.

I thought that I could solve it using the SVD or a pseudo-inverse, but I do not know how to implement this solution.
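For reference, the pseudo-inverse approach can be sketched as follows (Python/NumPy for illustration; the Matlab equivalent is `pinv(A) * b`). The 3x3 matrix here is a made-up stand-in for the real 89x89 A:

```python
import numpy as np

# Hypothetical rank-deficient symmetric matrix: row 3 is the sum of
# rows 1 and 2, so rank(A) = 2 < 3 and no ordinary inverse exists.
A = np.array([[2.0, 1.0, 3.0],
              [1.0, 2.0, 3.0],
              [3.0, 3.0, 6.0]])
b = np.array([1.0, 2.0, 3.0])

assert np.linalg.matrix_rank(A) == 2

# Moore-Penrose pseudo-inverse: gives the minimum-norm least-squares
# solution even though A is singular.
phi = np.linalg.pinv(A) @ b

# Here b happens to lie in the range of A, so the residual is ~0.
print(np.linalg.norm(A @ phi - b))
```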

I am trying to find the least-squares phase (Phi) fit to a given set of slopes.

Ax, Ay, and Axy are derivative matrix operators.

Can I impose a boundary condition (B.C.) to solve the problem? And how do I do this?

Thank you
 
  • #4
If the determinant is not zero, the matrix has to have full rank. You should double-check that the problem is intended as written.
 
  • #5
The determinant is not zero, BUT it is almost -Inf.
 
  • #6
This is a problem you are solving numerically on a computer, yes?

How are you estimating the rank?

How did you compute the determinant?

What does "the determinant is not zero BUT it is almost -Inf" mean?

It sounds like at the very least this is a poorly conditioned matrix. Yes, the pseudo-inverse can be used for these kinds of things to give a kind of solution (although I do not understand your notation at all so don't know the exact problem you are trying to solve ...). Wikipedia has a reasonable page on the Moore-Penrose pseudoinverse that may be worth your while.
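The Moore-Penrose pseudo-inverse can be built directly from the SVD by inverting only the singular values above a tolerance; Matlab's pinv does essentially this internally. A sketch (Python/NumPy, with a random rank-deficient test matrix):

```python
import numpy as np

def pseudo_inverse(A, tol=None):
    # Moore-Penrose pseudo-inverse via the SVD: invert only the
    # singular values above a tolerance; the near-zero ones are the
    # rank deficiency and are simply dropped.
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    if tol is None:
        tol = max(A.shape) * np.finfo(s.dtype).eps * s.max()
    s_inv = np.zeros_like(s)
    mask = s > tol
    s_inv[mask] = 1.0 / s[mask]
    return Vt.T @ (s_inv[:, None] * U.T)

# A 6x5 matrix of rank 3 (product of 6x3 and 3x5 factors).
rng = np.random.default_rng(0)
B = rng.standard_normal((6, 3)) @ rng.standard_normal((3, 5))

print(np.allclose(pseudo_inverse(B), np.linalg.pinv(B)))  # True
```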

jason
 
  • #7
I am running a Matlab code to solve the problem.
The rank and determinant are computed using the Matlab commands 'rank' and 'det'.

By saying that "the determinant is not zero BUT it is almost -Inf" I mean that the result of det(A) is around -1e24.
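For what it's worth, a large |det(A)| does not contradict numerical rank deficiency: the determinant scales like c^n under scaling by c, so it says nothing about conditioning, while 'rank' counts singular values above a tolerance. A sketch (Python/NumPy, made-up matrices):

```python
import numpy as np

n = 89
I = np.eye(n)

# A perfectly conditioned matrix can have an astronomical determinant.
print(np.linalg.det(10 * I))   # 1e89, yet cond = 1
print(np.linalg.cond(10 * I))  # 1.0

# Conversely, a matrix with one tiny singular value can still have a
# huge determinant while being numerically rank deficient.
s = np.full(n, 2.0)
s[-1] = 1e-16
A = np.diag(s)
print(np.linalg.det(A))          # ~2**88 * 1e-16, order 1e10
print(np.linalg.matrix_rank(A))  # 88: numerically rank deficient
```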

I am solving the following problem:

[tex]\vec{S} = \nabla \Phi[/tex]

Where
[tex]\vec{S} = S_x \hat{x} + S_y \hat{y}[/tex]

Ax, Ay and Axy are matrix operators such that:
Ax = [tex]\partial / \partial x[/tex]
Ay = [tex]\partial / \partial y[/tex]
Axy = [tex]\partial^2 / \partial x \, \partial y[/tex]

Solving the least-squares estimate I end up with the following equation:
[tex]A \Phi = A_x^{T} S_x + A_y^{T} S_y[/tex]

where,
[tex]A = A_x^{T} A_x + A_y^{T} A_y + A_{xy}^{T} A_{xy}[/tex]

Hence I need the inverse of A to find Phi, such that:
[tex]\Phi = A^{-1} A_x^{T} S_x + A^{-1} A_y^{T} S_y[/tex]

But matrix A is rank deficient, and I cannot find the inverse in the normal way.
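As an aside, this structure makes the rank deficiency expected: a phase is only determined by its slopes up to an additive constant, so the constant vector lies in the null space of the derivative operators, which matches Rank(A) = 88 for 89 unknowns. A toy 1-D sketch (Python/NumPy, hypothetical sizes):

```python
import numpy as np

# Recover a phase phi from its slope s = d(phi)/dx by least squares.
n = 10
# Forward-difference operator, (n-1) x n; it annihilates constants.
Ax = (np.eye(n, k=1) - np.eye(n))[:-1]
phi_true = np.sin(np.linspace(0, 3, n))
s = Ax @ phi_true

A = Ax.T @ Ax                      # normal-equations matrix
# One-dimensional null space (the constant vector), so rank n-1.
assert np.linalg.matrix_rank(A) == n - 1

# Minimum-norm least-squares solution via the pseudo-inverse.
phi = np.linalg.pinv(A) @ (Ax.T @ s)

# phi matches phi_true up to a constant offset only.
print(np.ptp(phi_true - phi))  # ~0: the difference is (nearly) constant
```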
 
  • #8
I still don't understand your problem. From what you wrote, is [tex]\Phi[/tex] a scalar function? Thus [tex]\mathbf{S}[/tex] is a vector (column vector?). Ax, Ay, Axy are all matrices, so Ax*Sx is a vector? ...

So to me your equation looks like: matrix * scalar = vector.

Which seems crazy, since you state that the matrix A is square ...

What are the dimensions of each term in your equation? It still makes absolutely no sense to me.

jason
 
  • #9
For some reason the fact that you are doing least squares just sunk in ...

I still do not understand your problem at all, but my hunch is you are doing least squares and are showing us the "normal equations" that you derive that way. That approach is usually very poor numerically. If you have a linear least-squares problem with a set of parameters [tex]\mathbf{x}[/tex] (n x 1) that you want to choose to minimize
[tex] | A \mathbf{x} - \mathbf{b} |^2 [/tex],
where matrix [tex]A[/tex] (m x n, with m>n) and the vector [tex]\mathbf{b}[/tex] (m x 1) are both known, then in Matlab you solve it by using mldivide (the \ operator). Just like this:

x = A\b;

Inside matlab, type

doc mldivide

to see the algorithm it uses. For the m>n case (usual least-squares) it uses the QR factorization which is a reasonable way to do least squares.
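The QR route can be sketched like this (Python/NumPy for illustration, full-column-rank case; the random data is a made-up stand-in for a real problem):

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 20, 5
A = rng.standard_normal((m, n))   # full column rank, m > n
b = rng.standard_normal(m)

# QR least squares: A = Q R, then solve R x = Q' b.
Q, R = np.linalg.qr(A)            # reduced QR: Q is m x n, R is n x n
x_qr = np.linalg.solve(R, Q.T @ b)

# Agrees with the library least-squares solver.
x_ref, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(x_qr, x_ref))   # True
```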

My hunch is that you are trying to solve (using MATLAB notation)
A'*A*x = A'*b

Mathematically that is the exact way to solve the least squares problem, but it is usually a disaster numerically since A'*A often has a very poor condition number and can be unstable.
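That loss of accuracy is easy to demonstrate: in the 2-norm, cond(A'*A) = cond(A)^2 for full-column-rank A, so the normal equations lose roughly twice as many digits as a QR or SVD solve. A quick check (Python/NumPy, random matrix):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((50, 8))

c = np.linalg.cond(A)             # 2-norm condition number of A
c_normal = np.linalg.cond(A.T @ A)

# Forming A'*A squares the condition number.
print(c_normal / c**2)            # ~1.0
```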

jason
 

1. What does it mean for a matrix to be rank deficient?

A matrix is rank deficient if its rank is less than the maximum possible rank (the smaller of its row and column counts). This means the matrix has linearly dependent rows or columns and, if square, cannot be inverted.
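For example (a sketch in Python/NumPy, with a made-up 3x3 matrix):

```python
import numpy as np

# Rows are linearly dependent (row 3 = row 1 + row 2),
# so the rank is 2 rather than 3.
A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 3.0]])
assert np.linalg.matrix_rank(A) == 2

# The last right-singular vector spans the null space: A @ v = 0,
# which is exactly why no inverse exists.
_, _, Vt = np.linalg.svd(A)
v = Vt[-1]
print(np.linalg.norm(A @ v))  # ~0
```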

2. Why is it important to find the inverse of a rank deficient matrix?

Finding the inverse of a matrix is important in many mathematical and scientific applications. A rank-deficient matrix has no ordinary inverse, so a generalized inverse (such as the Moore-Penrose pseudo-inverse) is used instead to obtain a least-squares or minimum-norm solution to the associated system of equations.

3. What are the challenges in finding the inverse of a rank deficient matrix?

The main challenge in finding the inverse of a rank deficient matrix is that it does not have a full set of linearly independent rows or columns. This means that the Gaussian elimination method, which is commonly used to find inverses, cannot be applied directly. Other methods, such as the Singular Value Decomposition (SVD) algorithm, must be used instead.

4. Can a rank deficient matrix always be inverted?

No, a rank deficient matrix cannot always be inverted. If the matrix has linearly dependent rows or columns, it is not possible to find a unique solution and therefore an inverse does not exist.

5. Are there any real-world applications where finding the inverse of a rank deficient matrix is important?

Yes, there are many real-world applications where handling a rank-deficient matrix is important. One example is least-squares reconstruction of a phase from measured slopes, as in this thread, where the phase is only determined up to a constant. Another is regression with collinear variables in data analysis, where the normal-equations matrix becomes rank deficient and a pseudo-inverse is used.
