Have you tried using ridge regression to solve for a singular Jacobian matrix?

  • Context: Graduate
  • Thread starter: top40
  • Tags: Method, Newton
Discussion Overview

The discussion revolves around the challenges of solving a nonlinear system of equations using the Newton-Raphson method, particularly when faced with a singular Jacobian matrix. Participants explore various strategies to address the singularity issue, including LU decomposition, singular value decomposition (SVD), and ridge regression (Tikhonov regularization).

Discussion Character

  • Technical explanation
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • One participant describes the Newton-Raphson method and the issue of obtaining a singular Jacobian matrix, which prevents finding its inverse.
  • Another participant suggests that the singular matrix may relate to a vector analog where the derivative equals zero.
  • A participant references Wikipedia to propose that instead of computing the inverse, one could solve the linear system directly, but expresses concern about the potential for leading principal minors to be zero.
  • One participant asserts that if the Jacobian is correctly calculated, the system may have no solution or an infinite number of solutions, and mentions that SVD could be used for a least norm solution.
  • Another participant elaborates on the conditions under which a system of linear equations has unique, no, or infinite solutions based on the rank of the matrix.
  • There is a discussion about the implications of LU decomposition for singular matrices, with participants agreeing that such matrices do not have unique solutions.
  • A participant introduces ridge regression as a potential method to address the singularity by adding a small quantity to the diagonal of the Jacobian, noting mixed results from this approach.

Areas of Agreement / Disagreement

Participants generally agree that a singular Jacobian matrix leads to complications in finding solutions, but multiple competing views remain regarding the best methods to address the singularity, including LU decomposition, SVD, and ridge regression. The discussion remains unresolved regarding the effectiveness of these methods in this specific context.

Contextual Notes

Participants express uncertainty about the proper calculation of the Jacobian and the implications of singularity on the existence of solutions. There are also limitations noted regarding the choice of initial guesses and the potential for misleading results when using ridge regression.

top40
I have a nonlinear system of equations and am using the Newton-Raphson method to solve it. The method is basically as follows:

J*dx = -F, where J is the Jacobian matrix, dx is a vector whose elements are the changes to x, and F is a vector whose elements are the function values at x.

To find dx, I compute the inverse of J and multiply both sides by it, yielding dx = J^-1 * (-F).

The only problem is that I keep getting a singular matrix J, so I can't find the inverse. Any suggestions?
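The iteration described above can be sketched in a few lines. The example system, starting point, and tolerance here are assumptions for illustration (the thread never gives the actual equations); `np.linalg.solve` is used for the linear step, which raises an error when J is singular:

```python
import numpy as np

def F(x):
    # Assumed example system: x0^2 + x1^2 - 4 = 0, x0 - x1 = 0
    return np.array([x[0]**2 + x[1]**2 - 4.0, x[0] - x[1]])

def J(x):
    # Jacobian of F above
    return np.array([[2.0 * x[0], 2.0 * x[1]],
                     [1.0,        -1.0]])

x = np.array([1.0, 0.5])                 # initial guess (assumed)
for _ in range(50):
    dx = np.linalg.solve(J(x), -F(x))    # raises LinAlgError if J is singular
    x = x + dx
    if np.linalg.norm(dx) < 1e-12:
        break
# x converges to (sqrt(2), sqrt(2)) for this example
```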
 
I can't answer directly, but it looks like you've encountered a vector analog of using this method where the derivative = 0 (one dim.).
 
I just read Wikipedia, and this is what it had to say:

"Rather than actually computing the inverse of this matrix [they are talking about J], one can save time by solving the system of linear equations

J * dx = -F

for the unknown dx."

So then I did more reading and found that a singular matrix CAN be LU decomposed: "If the matrix is singular, then an LU factorization may still exist. In fact, a square matrix of rank k has an LU factorization if the first k leading principal minors are non-zero, although the converse is not true."

So basically, instead of finding the inverse of J and multiplying both sides by J^-1, maybe I need to decompose J and use back substitution. However, with my luck, I'll bet that some of the first k leading principal minors will be 0, making the matrix undecomposable.
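The LU behavior in that quote can be seen numerically. This is a sketch on an assumed rank-1 example: the factorization exists, but a zero lands on U's diagonal, so back substitution breaks down rather than producing a unique solution:

```python
import numpy as np
from scipy.linalg import lu

# Assumed singular example: rank 1, first leading principal minor nonzero
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

P, L, U = lu(A)                   # partial-pivoted factorization, A = P @ L @ U
print(U)                          # the last diagonal entry of U is 0
# Back substitution would divide by U[1, 1] == 0, so no unique solution exists.
```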
 
Assuming J is properly calculated (you may want to verify) then there is either no solution or an infinite # of solutions. I doubt LU will get around the problem. In the case of infinite # solutions, I believe SVD (singular value decomposition) can be used to find a 'least norm' solution though I don't know the details of how. It could also be that you need a different 'initial guess' for starting point for the solution. These kinds of problems can be sensitive to starting point.
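The SVD / least-norm idea can be sketched with NumPy's pseudoinverse, which is computed via the SVD. The singular but consistent system used here is an assumed example, not the thread's actual equations:

```python
import numpy as np

# Assumed singular but consistent system: both rows say x + 2y = 5
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])
b = np.array([5.0, 10.0])

x = np.linalg.pinv(A) @ b         # Moore-Penrose pseudoinverse via SVD
print(x)                          # [1. 2.], the minimum-norm solution
```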

If you want to post (or PM) the equations and they aren't too nasty, I'd be happy to take a look at them.
 
hotvette said:
Assuming J is properly calculated (you may want to verify) then there is either no solution or an infinite # of solutions.

Why is that?
 
A system of linear equations Ax = b either has a unique solution, no solution, or an infinite number of solutions depending on the rank of [A] compared to the rank of the augmented matrix [A b] and the number of unknowns. The following link might be helpful.

http://www.facstaff.bucknell.edu/maneval/help211/linalg.html

Here are some simple examples to illustrate.

1. The following system has a unique solution of x=1, y=2 (i.e. the matrix A has an inverse):

x + 2y = 5
2x + 3y = 8


2. The following system is singular (i.e. the matrix A has no inverse):

x + 2y = 5
2x + 4y = 12

and has no solution because the 2nd equation simplifies to x + 2y = 6 which clearly conflicts with the first equation x + 2y = 5.


3. The following system is also singular:

x + 2y = 5
2x + 4y = 10

Both equations simplify to x + 2y = 5 and there are infinitely many solutions (for any arbitrary value of x you can determine a corresponding value for y). Even though there are an infinite number of solutions, there is one solution (x=1, y=2) that results in a minimum norm (i.e. minimum x^2 + y^2).

Thus, if J is singular, there is either no solution or infinitely many solutions.
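The rank test behind this can be sketched numerically. The helper below is a hypothetical illustration (not code from the thread) that classifies the three example systems above by comparing rank(A), rank([A b]), and the number of unknowns:

```python
import numpy as np

def classify(A, b):
    # Compare rank of A with rank of the augmented matrix [A | b]
    rA = np.linalg.matrix_rank(A)
    rAb = np.linalg.matrix_rank(np.column_stack([A, b]))
    n = A.shape[1]
    if rA < rAb:
        return "no solution"
    return "unique solution" if rA == n else "infinitely many solutions"

A1 = np.array([[1.0, 2.0], [2.0, 3.0]])
A2 = np.array([[1.0, 2.0], [2.0, 4.0]])
print(classify(A1, np.array([5.0, 8.0])))    # example 1 -> unique solution
print(classify(A2, np.array([5.0, 12.0])))   # example 2 -> no solution
print(classify(A2, np.array([5.0, 10.0])))   # example 3 -> infinitely many solutions
```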
 
So a matrix that 1) can be LU decomposed and 2) does not have an inverse does not have a unique solution?
 
top40 said:
So a matrix that 1) can be LU decomposed and 2) does not have an inverse does not have a unique solution?

Correct. In my previous post, the LU decomposition exists for each example but that doesn't help in the last two examples where the matrix is singular.
 
There is something you can try that may or may not be useful. It's called ridge regression (also called Tikhonov regularization) that adds a small numerical quantity to the diagonals of the singular matrix. If poor choice of starting point is the reason J is singular, it might help the solution eventually converge to a point where you can reduce the added quantity to zero and end up with a valid solution. But, it may also produce results that make no sense.

In the case of problem #3 in my previous post, adding 0.001 to the diagonal produces the result x=0.9998, y=1.9996, which is a solution that makes sense. However, for problem #2, the result is x=-798.83, y=402.32 which doesn't make any sense.
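A minimal sketch of the procedure just described (a quantity of 0.001 added to the diagonal before solving), reproducing both quoted results for problems #2 and #3:

```python
import numpy as np

def ridge_solve(A, b, eps=0.001):
    # Add a small quantity to the diagonal so the matrix is no longer singular
    return np.linalg.solve(A + eps * np.eye(A.shape[0]), b)

A = np.array([[1.0, 2.0], [2.0, 4.0]])
print(ridge_solve(A, np.array([5.0, 10.0])))  # ~[0.9998, 1.9996]  (sensible)
print(ridge_solve(A, np.array([5.0, 12.0])))  # ~[-798.8, 402.3]   (nonsense)
```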
 
