How do I know if a matrix is positive definite?

  • Context: Undergrad
  • Thread starter: Telemachus
  • Tags: Matrix, Positive
SUMMARY

The discussion centers on determining the positive definiteness of a real tridiagonal symmetric matrix arising from the discretization of a partial differential equation. The matrix elements are defined using parameters such as ##\kappa## and ##\mu##, which are positive or zero. An initially proposed condition, that ##A_{i,i+1}A_{i+1,i}>0## implies all eigenvalues ##\lambda_n>0##, is refuted by a simple counterexample. The discussion instead concludes that the matrix is positive definite because of its diagonal dominance and strictly positive diagonal, via the Gerschgorin circle theorem.

PREREQUISITES
  • Understanding of tridiagonal symmetric matrices
  • Familiarity with eigenvalues and eigenvectors
  • Knowledge of diagonal dominance in matrices
  • Basic concepts of Gerschgorin circle theorem
NEXT STEPS
  • Study the Gerschgorin circle theorem in detail
  • Learn about diagonal dominance and its implications for matrix properties
  • Explore numerical methods for verifying matrix positive definiteness
  • Investigate the application of LAPACK routines for symmetric tridiagonal matrices
USEFUL FOR

Mathematicians, numerical analysts, and engineers working with matrix computations, particularly those involved in solving partial differential equations and optimizing matrix properties for numerical stability.

Telemachus
Hi. I have a real tridiagonal symmetric matrix that comes from the discretization of a partial differential equation. The elements are given by:

##A_{i,j}=-\delta_{i-1,j}\kappa_{i-1/2,l}\frac{\Delta t}{h^2}+\delta_{i,j}\left[\frac{2}{c}+\frac{\Delta t}{2}\mu_{i,j}+\frac{\Delta t}{h^2}\left(\kappa_{i+1/2,l}+\kappa_{i-1/2,l}\right) \right]-\delta_{i+1,j}\kappa_{i+1/2,l}\frac{\Delta t}{h^2}##.

##\kappa## and ##\mu## are discretized functions, and both are positive (##\mu## might be zero at some points, but never negative, and ##\kappa## is strictly positive). This is a symmetric tridiagonal matrix. I would like to know if it is positive definite.

Now I have read somewhere (I don't know where, but I took note) that if the product of the elements of the matrix:

(1) ##A_{i,i+1}A_{i+1,i}>0##,

then all eigenvalues of ##A##, say ##\lambda_n##, satisfy ##\lambda_n>0##, and therefore ##A## is positive definite. Condition (1) holds for my matrix. So is ##A## positive definite? I think it is under assumption (1), but I don't know where the theorem that turns condition (1) into positive eigenvalues comes from.

Thanks in advance.
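As a numerical sanity check, the matrix above can be assembled and its eigenvalues inspected directly. The sketch below is not from the thread: it uses NumPy with illustrative values for ##c##, ##\Delta t##, ##h## and hypothetical random grids for ##\kappa## (strictly positive) and ##\mu## (nonnegative), as stated in the post.

```python
import numpy as np

def build_matrix(kappa_half, mu, c, dt, h):
    """Assemble the tridiagonal matrix from the post.

    kappa_half[i] stands for kappa_{i-1/2,l}, so row i uses
    kappa_half[i] (left neighbour) and kappa_half[i + 1] (right one).
    """
    n = len(mu)
    r = dt / h**2
    A = np.zeros((n, n))
    for i in range(n):
        A[i, i] = 2.0 / c + 0.5 * dt * mu[i] + r * (kappa_half[i] + kappa_half[i + 1])
        if i > 0:
            A[i, i - 1] = -kappa_half[i] * r
        if i < n - 1:
            A[i, i + 1] = -kappa_half[i + 1] * r
    return A

rng = np.random.default_rng(0)
n = 8
kappa_half = rng.uniform(0.5, 2.0, n + 1)  # strictly positive, as stated
mu = rng.uniform(0.0, 1.0, n)              # nonnegative, as stated
A = build_matrix(kappa_half, mu, c=1.0, dt=0.1, h=0.2)

print(np.allclose(A, A.T))              # True: the matrix is symmetric
print(np.linalg.eigvalsh(A).min() > 0)  # True for these parameters
```

Here the smallest eigenvalue stays positive because the diagonal exceeds the row sum of off-diagonal magnitudes by at least ##2/c##, which is the point made later in the thread.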
 
Telemachus said:
Now I have read somewhere (I don't know where, but I took note) that if the product of the elements of the matrix:

(1) ##A_{i,i+1}A_{i+1,i}>0##,

then all eigenvalues of ##A##, say ##\lambda_n##, satisfy ##\lambda_n>0##, and therefore ##A## is positive definite. Condition (1) holds for my matrix.
I am not certain about the exact conditions of your assertion, but wouldn't the matrix
$$
A =
\begin{bmatrix}
-1& 1\\
1& 1
\end{bmatrix}
$$
with eigenvalues ##\lambda_{\pm} = \pm\sqrt{2}## be a counterexample?
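This counterexample is easy to verify numerically; a quick check with NumPy (my sketch, not from the thread):

```python
import numpy as np

# The proposed counterexample: off-diagonal product 1*1 > 0,
# yet the matrix is clearly not positive definite.
A = np.array([[-1.0, 1.0],
              [1.0, 1.0]])
eig = np.linalg.eigvalsh(A)
print(eig)  # approximately [-1.41421356  1.41421356], i.e. +-sqrt(2)
```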
 
Yes, you are right. My assertion and my notes are actually wrong. Thank you.

Is there any way to know if my matrix will be positive definite in general under the conditions stated above? One difference I notice between my case and your counterexample is that in the matrix I've posted, all the elements on the principal diagonal are strictly positive.
 
Telemachus said:
Is there any way to know if my matrix will be positive definite in general under the conditions stated above? One difference I notice between my case and your counterexample is that in the matrix I've posted, all the elements on the principal diagonal are strictly positive.

For ##2 \times 2## matrices
$$
A =
\begin{bmatrix}
a& c\\
c& b
\end{bmatrix}
$$
you can make counterexamples using that ##\text{det}\,{A} = ab - c^2 = \lambda_1\lambda_2## and ##\text{tr}\,{A} = a + b = \lambda_1 + \lambda_2##. So, for example, you can pick ##a = b = 1## to have a strictly positive main diagonal and then any real ##c## with ##c^2 > 1## to deduce that exactly one of the eigenvalues must be negative.
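One concrete instance of this construction, checked numerically (my sketch, not from the thread): with ##a = b = 1## and ##c = 2##, the determinant is ##1 - 4 = -3 < 0##, so exactly one eigenvalue is negative despite the positive diagonal.

```python
import numpy as np

a = b = 1.0   # strictly positive main diagonal
c = 2.0       # any real value with c**2 > 1 works
A = np.array([[a, c],
              [c, b]])
eig = np.linalg.eigvalsh(A)
print(eig)  # [-1.  3.]: positive diagonal, yet not positive definite
```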

It could of course be that your specific matrix has additional structure that you can exploit to get positive definiteness, but I did not check this. I mainly became curious about the general statement in your OP.
 
Telemachus said:
in the matrix I've posted all the elements in the principal diagonal are strictly positive.

Is your matrix diagonally dominant? If so, then that is enough here. (There is a strong interlocking of Gerschgorin's discs and diagonally dominant matrices)
 
StoneTemplePython said:
Is your matrix diagonally dominant? If so, then that is enough here. (There is a strong interlocking of Gerschgorin's discs and diagonally dominant matrices)

The diagonal term is clearly bigger than the off-diagonal terms, because it contains them. I think the matrix is diagonally dominant, but I'm not completely sure. I have:

##A_{i,i}=\left[\frac{2}{c}+\frac{\Delta t}{2}\mu_{i,i}+\frac{\Delta t}{h^2}\left(\kappa_{i+1/2,l}+\kappa_{i-1/2,l}\right) \right]##
##A_{i,i+1}=-\kappa_{i+1/2,l}\frac{\Delta t}{h^2}##
##A_{i,i-1}=-\kappa_{i-1/2,l}\frac{\Delta t}{h^2}##

So ##\sum_{j \neq i}|A_{i,j}|=\kappa_{i-1/2,l}\frac{\Delta t}{h^2}+\kappa_{i+1/2,l}\frac{\Delta t}{h^2} \leq \left[\frac{2}{c}+\frac{\Delta t}{2}\mu_{i,i}+\frac{\Delta t}{h^2}\left(\kappa_{i+1/2,l}+\kappa_{i-1/2,l}\right) \right]=A_{i,i}##

So, I think it actually is.
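The row-by-row check above can be automated. Here is a small sketch (mine, not from the thread) that tests weak or strict diagonal dominance of a dense symmetric array, using an illustrative tridiagonal matrix with the same sign pattern as the one in the OP:

```python
import numpy as np

def is_diagonally_dominant(A, strict=False):
    """True if |A[i,i]| >= (or >, if strict) the sum of the other |A[i,j]|."""
    diag = np.abs(np.diag(A))
    off = np.sum(np.abs(A), axis=1) - diag
    return bool(np.all(diag > off) if strict else np.all(diag >= off))

# Illustrative matrix: positive diagonal, negative off-diagonals.
A = np.array([[4.0, -1.0, 0.0],
              [-1.0, 4.0, -1.0],
              [0.0, -1.0, 4.0]])
print(is_diagonally_dominant(A, strict=True))  # True
```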

Furthermore, I don't know what Gerschgorin's discs are, but I think you are right. I've used a LAPACK routine for symmetric tridiagonal positive definite matrices (I wanted to know whether the matrix has that structure in general, so that I could use this routine), and I get the same result as with the routine for general tridiagonal matrices. However, I needed to know whether this holds generally enough to use that subroutine in every case for the given functions, and after this I think the answer is yes. So thank you all very much.

Edit, after Wikipedia: is the matrix diagonally dominant enough to apply the Gershgorin circle theorem? I think it is, because the radii bounding the eigenvalues seem never to let them be negative. Thank you very much!
 
Telemachus said:
So ##\sum_{j \neq i}|A_{i,j}|=\kappa_{i-1/2,l}\frac{\Delta t}{h^2}+\kappa_{i+1/2,l}\frac{\Delta t}{h^2} \leq \left[\frac{2}{c}+\frac{\Delta t}{2}\mu_{i,i}+\frac{\Delta t}{h^2}\left(\kappa_{i+1/2,l}+\kappa_{i-1/2,l}\right) \right]=A_{i,i}##

Recalling that the diagonal elements of your real symmetric matrix are all positive: the above is the key fact and tells you all you need to know.

Two approaches from here:

1.) Graphically:
Draw a circle centered at each of the points given by ##A_{i,i}## (the backdrop is the complex plane). The radius of each circle is ##\sum_{j \neq i}|A_{i,j}|##. In each and every case you'll see that the radius is small enough that the circle never crosses past zero into negative territory, hence you have no negative eigenvalues. Technically you should now shade in your circles and call them 'discs'.

(There are technical nits about unions of discs and how to think about disjoint discs, but that's not really the point -- the big idea is that each and every eigenvalue must be in some shaded area -- i.e. in some disc.)

The above proves that your matrix has no negative eigenvalues -- i.e. positive semi-definiteness. With a bit of legwork you should be able to demonstrate your matrix is non-singular and hence positive definite. Note: If the matrix is strictly diagonally dominant, i.e. ##\sum_{i \neq j}|A_{i,j}| \lt \vert A_{i,i} \vert## then such a (square) matrix is always invertible.
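The graphical argument reduces to one number: the leftmost point over all discs, ##\min_i (A_{i,i} - R_i)##. A small sketch of that bound (mine, not from the thread), using an illustrative matrix:

```python
import numpy as np

def gerschgorin_lower_bound(A):
    """Leftmost point over all Gerschgorin discs: min_i (A[i,i] - R_i),
    where R_i is the sum of absolute off-diagonal entries in row i."""
    radii = np.sum(np.abs(A), axis=1) - np.abs(np.diag(A))
    return float(np.min(np.diag(A) - radii))

A = np.array([[4.0, -1.0, 0.0],
              [-1.0, 4.0, -1.0],
              [0.0, -1.0, 4.0]])
print(gerschgorin_lower_bound(A))   # 2.0: every eigenvalue is >= 2
print(np.linalg.eigvalsh(A).min())  # indeed >= 2, consistent with the bound
```

If this bound is strictly positive (strict diagonal dominance with a positive diagonal), the symmetric matrix is positive definite outright.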

2.) More in-depth: For a nice walk-through of Gerschgorin discs, take a look at:

https://golem.ph.utexas.edu/category/2016/08/in_praise_of_the_gershgorin_di.html
 
Let me ask you one more question. If I added some boundary conditions, some rows would have a one on the diagonal and zeros everywhere else, while the other rows would stay the same as before. In that case, is the matrix still positive definite?

I think it is straightforward, and the conclusion is the same as before, but I'm tired and I've been having numerical troubles all day, so I wanted to ask.
 
Telemachus said:
Let me ask you one more question. If I added some boundary conditions, some rows would have a one on the diagonal and zeros everywhere else, while the other rows would stay the same as before. In that case, is the matrix still positive definite?

I think it is straightforward, and the conclusion is the same as before, but I'm tired and I've been having numerical troubles all day, so I wanted to ask.

do you mean creating the below matrix ##\mathbf B##, where

##\mathbf B = \begin{bmatrix}
\mathbf A & \mathbf 0^T \\
\mathbf 0 & \mathbf I
\end{bmatrix}##

note that ##\mathbf B^T = \mathbf B##

Then yes, because you're just appending eigenvalues of 1 and keeping symmetry intact. If I read you right you may not have quite as clean a blocked structure, but you're looking to in effect do the same thing (i.e. keep symmetry and diagonal dominance intact, and append some eigenvalues of 1 to the matrix), which gets you the same result.
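The "appending eigenvalues of 1" claim is easy to check numerically. A sketch (mine, not from the thread) with a small symmetric positive definite stand-in for ##\mathbf A##:

```python
import numpy as np

A = np.array([[4.0, -1.0],
              [-1.0, 4.0]])  # small symmetric positive definite stand-in
k = 2                        # number of appended identity (boundary) rows
n = A.shape[0]

# Build B = [[A, 0], [0, I]] as a block-diagonal matrix.
B = np.zeros((n + k, n + k))
B[:n, :n] = A
B[n:, n:] = np.eye(k)

print(np.linalg.eigvalsh(A))  # [3. 5.]
print(np.linalg.eigvalsh(B))  # [1. 1. 3. 5.]: A's spectrum plus k ones
```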
 
StoneTemplePython said:
do you mean creating the below matrix ##\mathbf B##, where

##\mathbf B = \begin{bmatrix}
\mathbf A & \mathbf 0^T \\
\mathbf 0 & \mathbf I
\end{bmatrix}##

note that ##\mathbf B^T = \mathbf B##

Then yes, because you're just appending eigenvalues of 1 and keeping symmetry intact. If I read you right you may not have quite as clean a blocked structure, but you're looking to in effect do the same thing (i.e. keep symmetry and diagonal dominance intact, and append some eigenvalues of 1 to the matrix), which gets you the same result.
Yes, that's it, though indeed I don't have that clean a structure. I have something more like:

##\mathbf B = \begin{bmatrix}
\mathbf I & \mathbf 0^T & \mathbf 0^T \\
\mathbf 0 & \mathbf A & \mathbf 0^T \\
\mathbf 0 & \mathbf 0 & \mathbf I
\end{bmatrix}##

where ##A## now also has some rows with ##A_{i,j}=\delta_{i,j}##. But I think I can still apply the theorem you've mentioned: there are no off-diagonal elements in those rows, so the eigenvalues corresponding to them are just 1, right? However, I think I'll change strategy, because the matrix with this structure is not giving me good results; I'll put the boundary conditions into a vector on the right-hand side and keep the structure of the matrix as I gave it at the beginning.
 
Telemachus said:
where ##A## now also has some rows with ##A_{i,j}=\delta_{i,j}##. But I think I can still apply the theorem you've mentioned: there are no off-diagonal elements in those rows, so the eigenvalues corresponding to them are just 1, right?

This is right.

Note that in the context of eigenvalues, you are interested in making the matrix ##\big(\mathbf B - \lambda_k \mathbf I\big)## singular. There are lots of ways to formalize this (I like blocked multiplication and traces of successive powers of ##\mathbf B##), but you really should just be able to eyeball it and see that any of your old ##\lambda##'s would still make the matrix non-invertible, and of course ##\lambda_k = 1## will surely do the trick.
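The singularity argument can be eyeballed numerically too. A sketch (mine, not from the thread) using the three-block structure from the previous post, with a small stand-in for ##\mathbf A##:

```python
import numpy as np

A = np.array([[4.0, -1.0],
              [-1.0, 4.0]])  # symmetric positive definite stand-in
n = A.shape[0]

# Embed A between two identity rows, as in the three-block structure.
B = np.zeros((n + 2, n + 2))
B[0, 0] = 1.0
B[1:n + 1, 1:n + 1] = A
B[n + 1, n + 1] = 1.0

# Each old eigenvalue of A, and lambda = 1, makes B - lambda*I singular.
for lam in list(np.linalg.eigvalsh(A)) + [1.0]:
    print(abs(np.linalg.det(B - lam * np.eye(n + 2))) < 1e-9)  # True each time
```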
 
Thank you very much for all of your help and patience, this was very constructive for me.
 
