# Hermitian, positive definite matrices

## Homework Statement

I am trying to prove the following:
if ##A\in \mathbb{C}^{m\times m}## is Hermitian with positive eigenvalues, then A is positive definite. This was a fairly easy proof. The next part wants me to prove that if A is positive definite, then
$$\Delta_k = \begin{bmatrix} a_{11} & \cdots & a_{1k} \\ \vdots & \ddots & \vdots\\ a_{k1} & \cdots & a_{kk} \end{bmatrix}$$

is also positive definite, for ##k = 1, \ldots, m##.

## The Attempt at a Solution

In the first part, I already proved that if a matrix is Hermitian with positive eigenvalues, then that matrix is positive definite. So I basically need to prove that ##\Delta_k## is Hermitian with positive eigenvalues, and therefore positive definite.

It is obvious that ##\Delta_k## is Hermitian, since it is just a submatrix of A, but how do I go about proving that this matrix also has positive eigenvalues? Any hints? It seems like it should be obvious, but I can't think of how to prove it.
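Not a proof, but a quick numerical sanity check of both claims with NumPy (the matrix ##A = B^*B + I## below is just an assumed random test case, not part of the problem):

```python
import numpy as np

rng = np.random.default_rng(0)

# A random Hermitian positive definite matrix: B* B is positive
# semidefinite, so adding the identity makes it positive definite.
m = 5
B = rng.standard_normal((m, m)) + 1j * rng.standard_normal((m, m))
A = B.conj().T @ B + np.eye(m)

# Every leading principal submatrix Delta_k should be Hermitian with
# positive eigenvalues, hence positive definite.
for k in range(1, m + 1):
    Dk = A[:k, :k]
    eigs = np.linalg.eigvalsh(Dk)  # eigenvalues of a Hermitian matrix, ascending
    print(k, eigs.min() > 0)       # expect True for every k
```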

Orodruin
Staff Emeritus
Homework Helper
Gold Member

How does ##\Delta_k## act on vectors outside of the k-dimensional subspace spanned by the first k unit vectors? (If you use it as the upper left block of an mxm matrix which is otherwise zero)

Hmm. Does this involve eigenvalues?

Or are you suggesting that I check ##x^T \Delta_k x##?

Ray Vickson
Homework Helper
Dearly Missed
Well that is exactly the DEFINITION of positive-definite.

Yes. I was thinking of just proving the eigenvalues are positive, based on the first part of the problem. If I work with ##x^T \Delta_k x##, this would not involve eigenvalues in any way, right?

Ray Vickson
Homework Helper
Dearly Missed
Well, the eigenvalues of a ##k \times k## submatrix may be different from any of those of the original matrix, so I cannot see how using eigenvalues will help; you no longer know they are all > 0!

Hmm, I see. I thought there was some sort of relation between the eigenvalues of the original matrix and its top-left submatrices. I guess not.
If I do ##x^T \Delta_k x##, is there a simple way to show this for a generic ##\Delta_k## matrix? I can write out the vector-matrix multiplication and sum the terms to show it's > 0, but that seems tedious. I am wondering if there's a much simpler way to show this.

Ray Vickson
Homework Helper
Dearly Missed
There might be such a relation, but I have never heard of one, and I doubt it would be easy to find even if it exists.

I honestly do not understand your difficulty; it all follows in one line and with almost zero work from the definition of positive-definiteness of the original matrix ##A##.

Hmm, I am not seeing this.
Since A is positive definite, that means:
##x^T A x > 0## for all ##x\neq 0##
But I am not sure how this would relate to ##\Delta_k##. There's something I'm not visualizing. Are you looking at this from an element-by-element multiplication perspective?

Nathanael
Homework Helper
@pyroknife Have you considered Orodruins hint?

Think about the mxm matrix which is all zeros except the upper left corner is Δk ... how is this matrix related to A?

The matrix with all zeros except the upper-left corner being ##\Delta_k## is essentially A when k = m. Should I be visualizing the submatrices ##\Delta_k## as matrices of zeros except with the upper-left corner as ##\Delta_k##?

Nathanael
Homework Helper
Let me use ##\Delta_k^{mxm}## to represent the mxm matrix of all zeros except the upper left corner is Δk (just so I can reference it easily).
Ok, true. But in general, when k ≠ m, how do you get from A to ##\Delta_k^{mxm}##?
What I mean specifically is, what matrix takes you from A to ##\Delta_k^{mxm}##?

Oh, hmm, let's see. If k = m, then the matrix would just be the identity.
If k < m, then this gets a little complicated. Let me see if I can find a relationship for this matrix. It basically needs to introduce the (m-k) columns and rows of zeros at the end of ##\Delta_k^{mxm}##.

Ray Vickson
Homework Helper
Dearly Missed
You want to know if the quadratic form
$$Q_k(x_1, x_2, \ldots, x_k) = (x_1, x_2, \ldots, x_k) \pmatrix{a_{11} & a_{12}& \cdots & a_{1k} \\ a_{21} & a_{22} & \cdots & a_{2k}\\ \vdots & \vdots & \ddots & \vdots \\ a_{k1} & a_{k2} & \cdots & a_{kk} } \pmatrix{x_1 \\ x_2 \\ \vdots \\x_k}$$
is always positive for ##(x_1, x_2, \ldots, x_k) \neq (0,0,\ldots, 0)##.

Yes, exactly. But it seems that to show the quadratic form is > 0, I'd need to carry out all the matrix multiplications and verify the result is > 0?

I have already shown that A is positive definite (not in this thread, but let's just assume I showed you guys).
Which means

##Q_m(x_1, x_2, \ldots, x_m) = (x_1, x_2, \ldots, x_m) \pmatrix{a_{11} & a_{12}& \cdots & a_{1m} \\
a_{21} & a_{22} & \cdots & a_{2m}\\
\vdots & \vdots & \ddots & \vdots \\
a_{m1} & a_{m2} & \cdots & a_{mm} }
\pmatrix{x_1 \\ x_2 \\ \vdots \\x_m} > 0##

Now I need to show that if I take out the last (m-k) elements from ##x## and the last (m-k) rows and columns from A, this still yields a positive definite matrix.

Nathanael
Homework Helper
You shouldn't have to go too far from the identity matrix to find it; maybe put some zeros in the right spots?

Let me try to hint at the proof in another way:
If ##\vec x ^TA\vec x >0## for all ##\vec x## then in particular it must be true for all ##\vec x## of the form ##\vec x^T = [x_1, ... , x_k, 0, ..., 0]## right?
(Sorry I don't know how to write a column vector in latex.)

Yes, that is true. So we are one step closer, but for the next part, if we add the (m-k) columns and rows of zeros, I am not sure why that would still yield something greater than zero.
Edit: Never mind, I understand that if we modify A and add the (m-k) columns and rows of zeros, it would still yield a positive definite matrix. Is ##x^T \Delta_k^{mxm} x = x_k^T \Delta_k x_k## correct?

where ##x_k = \begin{bmatrix}
x_1 \\
x_2 \\
\vdots\\
x_k
\end{bmatrix}
##

I think the above proves it, but I'm still thinking about the matrix that, when multiplied with A, yields zeros in the last (m-k) columns and rows. I was thinking it should just be a slight modification of the identity matrix, but I can't get it to work.
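The identity ##x^T \Delta_k^{mxm} x = x_k^T \Delta_k x_k## (with ##x_k## the first k components of ##x##) can at least be spot-checked numerically. A small sketch, using an assumed random real symmetric positive definite test matrix so that the ##x^T## notation applies literally:

```python
import numpy as np

rng = np.random.default_rng(1)
m, k = 6, 3
B = rng.standard_normal((m, m))
A = B.T @ B + np.eye(m)            # real symmetric positive definite test matrix

# Delta_k^{mxm}: the leading k x k block of A embedded in an all-zero m x m matrix
Dk_big = np.zeros((m, m))
Dk_big[:k, :k] = A[:k, :k]

x = rng.standard_normal(m)         # arbitrary vector in R^m
xk = x[:k]                         # its first k components

lhs = x @ Dk_big @ x               # x^T Delta_k^{mxm} x
rhs = xk @ A[:k, :k] @ xk          # x_k^T Delta_k x_k
print(np.isclose(lhs, rhs))        # the zero block kills the last m-k entries of x
```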

Nathanael
Homework Helper
Are you convinced that ##\vec x^T\Delta_k^{mxm}\vec x>0## for all ##\vec x \in C^{m}## implies that ##\vec v^T\Delta_k\vec v>0## for all ##\vec v \in C^{k}## ?
(This was the one part of the proof that I'm not sure of the best way to prove, but it seems clear enough.)

If so, then the goal becomes to show ##\vec x^T\Delta_k^{mxm}\vec x>0##. This follows nicely from the positive-definiteness of A if you think about the matrix which takes you from A to ##\Delta_k^{mxm}##

Now what is that matrix? I'm not sure how to hint at it without just saying it, but like I said it's similar to the identity matrix except with some extra zeros...

Ray Vickson
Homework Helper
Dearly Missed
$$Q_k(x_1, x_2, \ldots, x_k) = Q_m(x_1, x_2, \ldots, x_k, 0, 0 , \ldots 0)$$
No need to agonize over matrices; just write things out as ##Q_m = \sum_{i=1}^m \sum_{j=1}^m a_{ij} x_i x_j## and ##Q_k = \sum_{i=1}^k \sum_{j=1}^k a_{ij} x_i x_j##.
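The identity ##Q_k(x_1,\ldots,x_k) = Q_m(x_1,\ldots,x_k,0,\ldots,0)## can be spot-checked numerically as well (the random symmetric positive definite test matrix below is assumed, not part of the problem):

```python
import numpy as np

rng = np.random.default_rng(2)
m, k = 5, 2
B = rng.standard_normal((m, m))
A = B.T @ B + np.eye(m)            # symmetric positive definite test matrix

xk = rng.standard_normal(k)
x_pad = np.concatenate([xk, np.zeros(m - k)])   # (x_1, ..., x_k, 0, ..., 0)

Q_k = xk @ A[:k, :k] @ xk          # sum over i, j <= k of a_ij x_i x_j
Q_m = x_pad @ A @ x_pad            # full quadratic form with zeros appended
print(np.isclose(Q_k, Q_m))        # the two quadratic forms agree
print(Q_k > 0)                     # positive, since Q_m > 0 by positive definiteness
```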

Hold on, isn't
##x^T \Delta_k^{mxm} x = x_k^T \Delta_k x_k## ?

Because if so, then we have already proved ##\Delta_k ## is positive definite

Nathanael
Homework Helper
Multiplication can only be defined on one side of that equation.
(x either has k components or m components; either way it can't multiply both ##\Delta_k^{mxm}## and ## \Delta_k##)

Edit:
I didn't notice you put a subscript k on x on the right side. So then you're saying it's equal for any arbitrary x and xk? Surely that won't be true.

Also I didn't notice it from Ray's view before; my proof is slightly different. Maybe it would be better with his approach.

##Q_m = \sum_{i=1}^m \sum_{j=1}^m a_{ij} x_i x_j## and ##Q_k = \sum_{i=1}^k \sum_{j=1}^k a_{ij} x_i x_j##
I see this part.

Is the following true?
##Q_k = Q_m - \sum_{i=k+1}^m \sum_{j=k+1}^m a_{ij} x_i x_j##
If so, then since ##Q_m > 0##, how do we know this second term is not ##\geq Q_m##?

Yes, I was saying it's equal for ##x_k## such that ##x_k## is the first ##k## elements of an arbitrary ##x##. For some reason I thought this was true?

I was thinking the matrix we were talking about is just the ##m \times m## identity matrix, except with zeros in the last (m-k) columns and rows, but this doesn't seem to work.
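For what it's worth, that guess does seem to work if the modified identity matrix is applied on both sides of A. A numerical sketch (random symmetric positive definite test matrix assumed), where ##P## is the identity with its last (m-k) diagonal entries zeroed:

```python
import numpy as np

rng = np.random.default_rng(3)
m, k = 5, 3
B = rng.standard_normal((m, m))
A = B.T @ B + np.eye(m)            # symmetric positive definite test matrix

# P: the m x m identity with its last m-k diagonal entries set to zero
P = np.diag([1.0] * k + [0.0] * (m - k))

# Applying P on BOTH sides zeros the last m-k rows and columns of A,
# leaving Delta_k alone in the upper-left corner.
Dk_big = np.zeros((m, m))
Dk_big[:k, :k] = A[:k, :k]
print(np.allclose(P @ A @ P, Dk_big))   # P A P equals Delta_k padded with zeros
```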