I How do I know if a matrix is positive definite?

  1. Jul 31, 2017 #1
    Hi. I have a real tridiagonal symmetric matrix that comes from the discretization of a partial differential equation. The elements are given by:

    ##A_{i,j}=-\delta_{i-1,j}\kappa_{i-1/2,l}\frac{\Delta t}{h^2}+\delta_{i,j}\left[\frac{2}{c}+\frac{\Delta t}{2}\mu_{i,j}+\frac{\Delta t}{h^2}\left(\kappa_{i+1/2,l}+\kappa_{i-1/2,l}\right) \right]-\delta_{i+1,j}\kappa_{i+1/2,l}\frac{\Delta t}{h^2}##.

    ##\kappa## and ##\mu## are discretized functions: ##\mu## is non-negative (it might be zero at some points, but never negative) and ##\kappa## is strictly positive. The matrix is symmetric and tridiagonal, and I would like to know whether it is positive definite.
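    For concreteness, here is a minimal sketch of how such a matrix could be assembled (the grid size and the values of ##\Delta t##, ##h##, ##c##, ##\kappa##, and ##\mu## below are made-up placeholders, not my actual data):

    Code (Python):
    import numpy as np

    # --- placeholder parameters, not the actual problem data ---
    n = 6                         # number of grid points
    dt, h, c = 0.01, 0.1, 1.0     # time step, spatial step, constant c
    kappa = np.full(n + 1, 2.0)   # kappa at half-integer points (strictly positive)
    mu = np.zeros(n)              # mu on the grid (non-negative)

    r = dt / h**2
    A = np.zeros((n, n))
    for i in range(n):
        # diagonal: 2/c + (dt/2) mu_i + r (kappa_{i+1/2} + kappa_{i-1/2})
        A[i, i] = 2.0 / c + 0.5 * dt * mu[i] + r * (kappa[i + 1] + kappa[i])
        if i > 0:
            A[i, i - 1] = -r * kappa[i]       # sub-diagonal: -r kappa_{i-1/2}
        if i < n - 1:
            A[i, i + 1] = -r * kappa[i + 1]   # super-diagonal: -r kappa_{i+1/2}

    assert np.allclose(A, A.T)    # symmetric by construction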

    Now I have read somewhere (I don't know where, but I took note of it) that if the off-diagonal elements of the matrix satisfy

    (1) ##A_{i,i+1}A_{i+1,i}>0##,

    then all eigenvalues ##\lambda_n## of ##A## are positive, and therefore ##A## is positive definite. Condition (1) holds for my matrix. So is ##A## positive definite? I think it is, under assumption (1), but I don't know where the theorem that turns condition (1) into positive eigenvalues comes from.

    Thanks in advance.
     
  3. Jul 31, 2017 #2

    Krylov

    Science Advisor
    Education Advisor

    I am not certain about the exact conditions of your assertion, but wouldn't the matrix
    $$
    A =
    \begin{bmatrix}
    -1& 1\\
    1& 1
    \end{bmatrix}
    $$
    with eigenvalues ##\lambda_{\pm} = \pm\sqrt{2}## be a counterexample?
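    A quick numerical check of this, as a sketch:

    Code (Python):
    import numpy as np

    A = np.array([[-1.0, 1.0],
                  [ 1.0, 1.0]])
    # the off-diagonal product A[0,1]*A[1,0] = 1 > 0, yet one eigenvalue is negative
    print(np.linalg.eigvalsh(A))   # approx [-1.41421356, 1.41421356]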
     
  4. Jul 31, 2017 #3
    Yes, you are right. My assertion and my notes are actually wrong. Thank you.

    Is there any way to know whether my matrix will be positive definite in general under the conditions stated above? One difference I notice between my case and your counterexample is that in the matrix I posted, all the elements on the main diagonal are strictly positive.
     
    Last edited: Jul 31, 2017
  5. Jul 31, 2017 #4

    Krylov

    Science Advisor
    Education Advisor

    For ##2 \times 2## matrices
    $$
    A =
    \begin{bmatrix}
    a& c\\
    c& b
    \end{bmatrix}
    $$
    you can make counterexamples using that ##\text{det}\,{A} = ab - c^2 = \lambda_1\lambda_2## and ##\text{tr}\,{A} = a + b = \lambda_1 + \lambda_2##. So, for example, you can pick ##a = b = 1## to have a strictly positive main diagonal and then any real ##c## with ##c^2 > 1## to deduce that exactly one of the eigenvalues must be negative.
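    For instance, taking ##a = b = 1## and ##c = 2## (a quick sketch to confirm):

    Code (Python):
    import numpy as np

    a, b, c = 1.0, 1.0, 2.0          # strictly positive diagonal, c^2 > 1
    A = np.array([[a, c],
                  [c, b]])
    print(np.linalg.det(A))          # 1 - 4 = -3 < 0
    print(np.linalg.eigvalsh(A))     # [-1., 3.]: exactly one negative eigenvalue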

    It could of course be that your specific matrix has additional structure that you can exploit to get positive definiteness, but I did not check this. I mainly became curious about the general statement you made in your OP.
     
  6. Jul 31, 2017 #5

    StoneTemplePython

    Gold Member

    Is your matrix diagonally dominant? If so, then that is enough here. (There is a strong connection between Gerschgorin's discs and diagonally dominant matrices.)
     
  7. Jul 31, 2017 #6
    The diagonal term is clearly bigger than the off-diagonal terms, because it contains them. I think the matrix is diagonally dominant, but I'm not completely sure. I have:

    ##A_{i,i}=\left[\frac{2}{c}+\frac{\Delta t}{2}\mu_{i,i}+\frac{\Delta t}{h^2}\left(\kappa_{i+1/2,l}+\kappa_{i-1/2,l}\right) \right]##
    ##A_{i,i+1}=-\kappa_{i+1/2,l}\frac{\Delta t}{h^2}##
    ##A_{i-1,i}=-\kappa_{i-1/2,l}\frac{\Delta t}{h^2}##

    So ##\sum_{j \neq i}|A_{i,j}|=\kappa_{i-1/2,l}\frac{\Delta t}{h^2}+\kappa_{i+1/2,l}\frac{\Delta t}{h^2} \leq \left[\frac{2}{c}+\frac{\Delta t}{2}\mu_{i,i}+\frac{\Delta t}{h^2}\left(\kappa_{i+1/2,l}+\kappa_{i-1/2,l}\right) \right]=A_{i,i}##, and the inequality is in fact strict, since ##2/c>0## (assuming ##c>0##).

    So I think it actually is.

    Furthermore, I don't know what Gerschgorin's discs are, but I think you are right. I've used a LAPACK routine for symmetric positive definite tridiagonal matrices (I wanted to know whether my matrix has that structure in general, so that I can use this routine), and it gives the same result as the routine for general tridiagonal matrices. However, I needed to know whether this holds in general for the given functions, so that I can use the subroutine in every case, and after this I think the answer is yes. So thank you all very much.

    Edit, after reading Wikipedia: is this matrix diagonally dominant enough to apply the Gershgorin circle theorem? I think it is, because the radii it allows for the eigenvalues never seem to let them become negative. Thank you very much!
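    Here is a small sketch of the kind of check I mean, using SciPy's banded solvers as stand-ins for the LAPACK routines (the matrix below is a small placeholder version of the one in my first post, with ##c = 1## and ##\mu = 0##):

    Code (Python):
    import numpy as np
    from scipy.linalg import solve_banded, solveh_banded

    # small placeholder version of the matrix from post #1 (c = 1, mu = 0)
    n, r = 6, 0.01 / 0.1**2
    kappa = np.full(n + 1, 2.0)                 # strictly positive
    main = 2.0 + r * (kappa[1:] + kappa[:-1])   # diagonal entries
    off = -r * kappa[1:-1]                      # sub/super-diagonal entries
    A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

    # strict diagonal dominance: off-diagonal row sums vs. diagonal
    row_off = np.sum(np.abs(A), axis=1) - np.abs(main)
    print(np.all(row_off < main))               # True: strictly dominant

    b = np.ones(n)

    # general tridiagonal solver path
    ab = np.zeros((3, n))
    ab[0, 1:], ab[1, :], ab[2, :-1] = np.diag(A, 1), main, np.diag(A, -1)
    x1 = solve_banded((1, 1), ab, b)

    # symmetric positive definite banded solver path
    ab_pd = np.zeros((2, n))
    ab_pd[0, 1:], ab_pd[1, :] = np.diag(A, 1), main
    x2 = solveh_banded(ab_pd, b)

    print(np.allclose(x1, x2))                  # True: both solvers agree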
     
    Last edited: Jul 31, 2017
  8. Jul 31, 2017 #7

    StoneTemplePython

    Gold Member

    Recall that the diagonal elements of your real symmetric matrix are all positive: the above is the key fact, and it tells you all you need to know.

    Two approaches from here:

    1.) Graphically:
    Draw a circle centered at each of the points given by ##A_{i,i}##. (The backdrop is the complex plane.) The radius of each respective circle is given by ##\sum_{j \neq i}|A_{i,j}|##. In each and every case you'll see that the radius is so small that the circle never crosses past zero into negative territory, hence you have no negative eigenvalues. Technically you should now shade in your circles and call them 'discs'.

    (There are technical nits about unions of discs and how to think about disjoint discs, but that's not really the point -- the big idea is that each and every eigenvalue must be in some shaded area -- i.e. in some disc.)

    The above proves that your matrix has no negative eigenvalues -- i.e. positive semi-definiteness. With a bit of legwork you should be able to demonstrate that your matrix is non-singular and hence positive definite. Note: if the matrix is strictly diagonally dominant, i.e. ##\sum_{j \neq i}|A_{i,j}| \lt \vert A_{i,i} \vert##, then such a (square) matrix is always invertible.
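    A minimal numerical sketch of the disc check (using a stand-in symmetric tridiagonal matrix with a positive diagonal, not your actual data):

    Code (Python):
    import numpy as np

    # stand-in symmetric tridiagonal matrix with positive diagonal
    A = np.array([[ 4.0, -1.0,  0.0],
                  [-1.0,  5.0, -2.0],
                  [ 0.0, -2.0,  6.0]])

    centers = np.diag(A)
    radii = np.sum(np.abs(A), axis=1) - np.abs(centers)

    # for a real symmetric matrix the discs are intervals on the real axis;
    # every interval [center - radius, center + radius] stays right of zero
    print(np.all(centers - radii > 0))   # True -> all eigenvalues positive
    print(np.linalg.eigvalsh(A))         # confirms: every eigenvalue > 0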

    2.) More in-depth: For a nice walk-through of Gerschgorin discs, take a look at:

    https://golem.ph.utexas.edu/category/2016/08/in_praise_of_the_gershgorin_di.html
     
    Last edited: Jul 31, 2017
  9. Aug 1, 2017 #8
    Let me ask you one more question. If I add boundary conditions, some rows will have a one on the diagonal and zeros everywhere else, while the other rows of the matrix stay the same as before. In that case, is the matrix still positive definite?

    I think it is straightforward, and the conclusion is the same as before, but I'm tired and have been having numerical troubles all day, so I wanted to ask.
     
  10. Aug 1, 2017 #9

    StoneTemplePython

    Gold Member

    Do you mean creating the matrix ##\mathbf B## below, where

    ##\mathbf B = \begin{bmatrix}
    \mathbf A & \mathbf 0^T \\
    \mathbf 0 & \mathbf I
    \end{bmatrix}##

    Note that ##\mathbf B^T = \mathbf B##.

    Then yes, because you're just appending eigenvalues of 1 and keeping symmetry intact. If I read you right, you may not have quite as clean a blocked structure, but you're looking to do the same thing in effect (i.e. keep symmetry and diagonal dominance intact, and append some eigenvalues of 1 to the matrix), which gets you the same result.
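    A quick sketch of this, with a stand-in ##\mathbf A##:

    Code (Python):
    import numpy as np
    from scipy.linalg import block_diag

    # stand-in symmetric positive definite A
    A = np.array([[ 3.0, -1.0],
                  [-1.0,  3.0]])

    B = block_diag(A, np.eye(2))   # B = [[A, 0], [0, I]]
    print(np.linalg.eigvalsh(A))   # [2., 4.]
    print(np.linalg.eigvalsh(B))   # [1., 1., 2., 4.] -- A's spectrum plus appended 1's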
     
  11. Aug 1, 2017 #10
    Yes, that's it. Indeed, I don't have that clean a structure. I have something more like:

    ##\mathbf B = \begin{bmatrix}
    \mathbf I & \mathbf 0^T & \mathbf 0^T \\
    \mathbf 0 & \mathbf A & \mathbf 0^T \\
    \mathbf 0 & \mathbf 0 & \mathbf I
    \end{bmatrix}##

    where ##\mathbf A## now also has some rows with ##A_{i,j}=\delta_{i,j}##. But I think I can still apply the theorem you mentioned: those rows have no off-diagonal elements, so the eigenvalues corresponding to them are just 1, right? However, I think I'll change strategy, because the matrix with this structure is not giving me good results. I will put the boundary conditions in a vector on the right-hand side and keep the structure of the matrix as I gave it at the beginning.
     
  12. Aug 1, 2017 #11

    StoneTemplePython

    Gold Member

    This is right.

    Note that in the context of eigenvalues, you are interested in the values ##\lambda_k## that make ##\big(\mathbf B - \lambda_k \mathbf I\big)## singular. There are lots of ways to formalize this (I like blocked multiplication and traces of successive powers of ##\mathbf B##), but you really should just be able to eyeball it and see that any of your old ##\lambda##'s would still make the matrix non-invertible, and of course ##\lambda_k = 1## will surely do the trick.
     
  13. Aug 1, 2017 #12
    Thank you very much for all of your help and patience, this was very constructive for me.
     