
Finding the Inv. of a rank deficient Matrix

  1. Apr 27, 2010 #1
    I have the following problem:

    A * Phi = Ax' * Sx + Ay' * Sy

    where,
    A= Ax' * Ax + Ay' * Ay + Axy' * Axy

    and I would like to solve for Phi.

    Matrix A is:
    1) symmetric
    2) [89x89]
    3) Rank(A) = 88 (I guess this means there is no unique solution)
    4) Det(A) ~= 0, i.e. not zero (I guess this means that A is not singular)
    5) sparse (673 non-zero elements out of 7921, about 8.5%)
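
    (For reference, a minimal Matlab sketch of how properties like these are typically checked, assuming A has already been assembled as a sparse matrix; the variable names are only illustrative:)

    r    = rank(full(A));         % numerical rank (88 here); rank() needs a full matrix
    d    = det(A);                % determinant
    nz   = nnz(A);                % number of non-zero entries (673 here)
    frac = 100 * nz / numel(A);   % percentage of non-zeros, about 8.5%
    sym  = isequal(A, A.');       % true if A is symmetric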

    How can I find the inverse of A and solve for Phi?

    Thank you
     
  3. Apr 28, 2010 #2

    HallsofIvy

    Staff Emeritus
    Science Advisor

    You don't. A "rank deficient" matrix, which necessarily has determinant 0, does NOT have an inverse!

    Now, what are "Ax' ", "Sx", etc.?
     
  4. Apr 28, 2010 #3
    Det(A) is NOT equal to zero.

    I thought I could solve it using the SVD or a pseudo-inverse, but I do not know how to implement that solution.

    I am trying to find the least-squares phase (Phi) fit to a given set of slopes.

    Ax, Ay, and Axy are derivative matrix operators.

    Can I impose a boundary condition (B.C.) to solve the problem, and how do I do this?

    Thank you
     
  5. Apr 28, 2010 #4

    Office_Shredder

    Staff Emeritus
    Science Advisor
    Gold Member

    If the determinant is not zero, the matrix has to have full rank. You should double-check that the problem is intended as written.
     
  6. Apr 28, 2010 #5
    the determinant is not zero BUT it is almost -Inf.
     
  7. Apr 28, 2010 #6

    jasonRF

    Science Advisor
    Gold Member

    This is a problem you are solving numerically on a computer, yes?

    How are you estimating the rank?

    How did you compute the determinant?

    What does "the determinant is not zero BUT it is almost -Inf" mean?

    It sounds like at the very least this is a poorly conditioned matrix. Yes, the pseudo-inverse can be used for these kinds of things to give a kind of solution (although I do not understand your notation at all, so I don't know the exact problem you are trying to solve ...). Wikipedia has a reasonable page on the Moore-Penrose pseudoinverse that may be worth your while.
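
    (In Matlab the pseudo-inverse is available directly as pinv, or it can be built from the SVD by discarding the near-zero singular values. A minimal sketch, assuming a square matrix A and a right-hand-side vector b are already defined; the tolerance shown mirrors the default that rank() uses:)

    x = pinv(full(A)) * b;              % Moore-Penrose pseudo-inverse; pinv needs a full matrix

    % Equivalent truncated-SVD construction:
    [U, S, V] = svd(full(A));
    s   = diag(S);
    tol = max(size(A)) * eps(max(s));   % drop singular values below this tolerance
    k   = sum(s > tol);                 % number of significant singular values
    x   = V(:, 1:k) * ((U(:, 1:k)' * b) ./ s(1:k));   % minimum-norm least-squares solution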

    jason
     
  8. Apr 28, 2010 #7
    I am running a Matlab code to solve the problem.
    The rank and determinant were computed using the Matlab commands 'rank' and 'det'.

    By saying that "the determinant is not zero BUT it is almost -Inf" I mean that the result of det(A) is around -1e24.
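
    (Side note: the size of det(A) says very little about how close A is to singular, since the determinant scales with every row; the condition number is the more useful diagnostic. For example:)

    det(10 * eye(89))       % = 1e89, enormous, yet 10*eye(89) is perfectly conditioned
    cond(full(A))           % ratio of largest to smallest singular value of A
    rank(full(A))           % numerical rank, counting singular values above a tolerance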

    I am solving the following problem:

    [tex]\vec{S} = \nabla \Phi[/tex]

    Where
    [tex]\vec{S} = S_x \, \hat{x} + S_y \, \hat{y}[/tex]

    Ax, Ay and Axy are matrix operators such as:
    [tex]A_x = \frac{\partial}{\partial x}, \qquad A_y = \frac{\partial}{\partial y}, \qquad A_{xy} = \frac{\partial^2}{\partial x \, \partial y}[/tex]
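
    (One common way to realize such derivative operators as sparse matrices, though not necessarily what was done here, is finite differences on a rectangular grid built with Kronecker products. A rough, purely illustrative sketch; the grid size, spacing, and forward differences are assumptions:)

    nx = 10;  ny = 9;  h = 1;                       % illustrative grid
    ex = ones(nx, 1);  ey = ones(ny, 1);
    Dx1 = spdiags([-ex ex], [0 1], nx, nx) / h;     % 1-D forward difference in x
    Dy1 = spdiags([-ey ey], [0 1], ny, ny) / h;     % 1-D forward difference in y
    Ax  = kron(speye(ny), Dx1);                     % d/dx on the flattened nx-by-ny grid
    Ay  = kron(Dy1, speye(nx));                     % d/dy
    Axy = Ay * Ax;                                  % mixed derivative d^2/(dx dy)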

    Solving the least-squares estimate I end up with the following equation:
    [tex]A \, \Phi = A_x^{T} S_x + A_y^{T} S_y[/tex]

    where
    [tex]A = A_x^{T} A_x + A_y^{T} A_y + A_{xy}^{T} A_{xy}[/tex]

    Hence I need the inverse of A to find Phi, such that
    [tex]\Phi = A^{-1} A_x^{T} S_x + A^{-1} A_y^{T} S_y[/tex]

    But matrix A is rank deficient and I cannot find the inverse in the normal way.
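
    (A minimal sketch of one way this is often handled in Matlab, assuming Ax, Ay, Axy, Sx, and Sy are already built; since A is singular to within numerical tolerance, pinv returns the minimum-norm solution rather than a true inverse:)

    A   = Ax'*Ax + Ay'*Ay + Axy'*Axy;
    rhs = Ax'*Sx + Ay'*Sy;
    Phi = pinv(full(A)) * rhs;       % minimum-norm least-squares solution of A*Phi = rhs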
     
  9. Apr 28, 2010 #8

    jasonRF

    Science Advisor
    Gold Member

    I still don't understand your problem. From what you wrote, [tex]\Phi[/tex] is a scalar function? Thus [tex]\mathbf{S}[/tex] is a vector (column vector?). Ax, Ay, Axy are all matrices, so Ax*Sx is a vector??? ...

    So to me your equation looks like: matrix * scalar = vector.

    Which seems crazy, since you state that the matrix A is square ....

    What are the dimensions of each term in your equation? It still makes absolutely no sense to me.

    jason
     
  10. Apr 28, 2010 #9

    jasonRF

    Science Advisor
    Gold Member

    For some reason the fact that you are doing least squares just sunk in ....

    I still do not understand your problem at all, but my hunch is you are doing least squares and are showing us the "normal equations" that you derive that way. That approach is usually very poor numerically. If you have a linear least-squares problem with a set of parameters [tex]\mathbf{x}[/tex] (n x 1) that you want to find to minimize
    [tex]\| A \mathbf{x} - \mathbf{b} \|^2,[/tex]
    where the matrix [tex]A[/tex] (m x n, with m > n) and the vector [tex]\mathbf{b}[/tex] (m x 1) are both known, then in Matlab you solve it by using mldivide (the \ operator). Just like this:

    x = A\b;

    Inside matlab, type

    doc mldivide

    to see the algorithm it uses. For the m > n case (usual least squares) it uses the QR factorization, which is a reasonable way to do least squares.

    My hunch is that you are trying to solve (using matlab notation)
    A'*A*x = A'*b

    Mathematically that is an exact way to solve the least-squares problem, but it is usually a disaster numerically, since A'*A often has a very poor condition number and the computation can be unstable.
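
    (Concretely for the system in post #7: those normal equations are what you get from minimizing |Ax*Phi - Sx|^2 + |Ay*Phi - Sy|^2 + |Axy*Phi|^2, so the same fit can be computed without ever forming A'*A. A sketch, where the zero block for the Axy term is inferred from the stated normal equations:)

    C   = [Ax; Ay; Axy];                      % stacked operator (tall and thin)
    d   = [Sx; Sy; zeros(size(Axy, 1), 1)];   % stacked right-hand side
    Phi = C \ d;                              % QR-based least squares; may warn if C is rank deficient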

    jason
     
    Last edited: Apr 28, 2010