Hermitian, positive definite matrices

Ray Vickson
Homework Helper
Dearly Missed
##Q_m = \sum_{i=1}^m \sum_{j=1}^m a_{ij} x_i x_j## and ##Q_k = \sum_{i=1}^k \sum_{j=1}^k a_{ij} x_i x_j##
I see this part.

Is the following true?
##Q_k = Q_m - \sum_{i=k+1}^m \sum_{j=k+1}^m a_{ij} x_i x_j##
If so, then since ##Q_m > 0##, how do we know this second term is not ##\geq Q_m##?
You are having trouble here because either you only read the (last) half of what I wrote, or else have forgotten the first half when you read the second half. That is absolutely my last word on this topic.

pyroknife
You are having trouble here because either you only read the (last) half of what I wrote, or else have forgotten the first half when you read the second half. That is absolutely my last word on this topic.
Ah I get it. I was misreading the first part of that post. I had thought you were also adding in zeros into the last columns and rows of the A matrix for the ##Q_m## computation.
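That reading can be checked numerically. The following is a minimal sketch (assuming NumPy, with an arbitrary 4x4 example matrix): ##Q_k##, the double sum over the first ##k## indices, equals ##Q_m## evaluated at the same ##x## with its last ##m-k## entries set to zero — no zeroing of rows or columns of A is involved.

```python
import numpy as np

# Hypothetical example: a 4x4 symmetric positive definite A and k = 2.
rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
A = M @ M.T + 4 * np.eye(4)

x = rng.standard_normal(4)
k = 2

# Q_k: the double sum over the first k indices only.
Q_k = x[:k] @ A[:k, :k] @ x[:k]

# Q_m evaluated at x with its last m-k entries zeroed.
x_padded = x.copy()
x_padded[k:] = 0.0
Q_m_padded = x_padded @ A @ x_padded

print(np.isclose(Q_k, Q_m_padded))  # prints True: the two quadratic forms agree
```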

Nathanael
Homework Helper
Yes, I was saying it's equal for ##x_k## such that ##x_k## is the first ##k## elements of an arbitrary ##x##. For some reason I thought this was true?
Yes if you define x_k that way then it is true.
Because if so, then we have already proved ##\Delta_k ## is positive definite
No, that only proves the first part of post #20, you still need to prove that ##\Delta_k^{mxm}## is positive definite, right?
The proof to this latter part is what I've been trying to hint. It goes like this:

##(\vec y)^TA(\vec y) >0 \implies (J_k\vec x)^TA(J_k\vec x) = \vec x^T(J_k^TAJ_k)\vec x > 0##

If you can find a matrix Jk such that ##J_k^TAJ_k = \Delta_k^{mxm}## then that would prove the positive definiteness of ##\Delta_k^{mxm}## which would complete the proof. This matrix Jk is what I've been waiting for you to realize (it's the one I've been talking about which is similar to the identity).
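The congruence step above can be sketched numerically. This is only an illustrative check (assuming NumPy and an arbitrary positive definite ##A##): for a nonzero ##x## whose last ##m-k## entries are zero, ##J_k x = x##, so ##x^T(J_k^TAJ_k)x = (J_k x)^TA(J_k x) > 0##.

```python
import numpy as np

rng = np.random.default_rng(1)
m, k = 4, 2
M = rng.standard_normal((m, m))
A = M @ M.T + m * np.eye(m)   # a positive definite m x m matrix

# J_k: m x m, with the k x k identity in the top-left, zeros elsewhere.
J = np.zeros((m, m))
J[:k, :k] = np.eye(k)

# Take x != 0 supported in the first k coordinates, so J_k x = x.
x = np.zeros(m)
x[:k] = rng.standard_normal(k)

lhs = (J @ x) @ A @ (J @ x)        # (J_k x)^T A (J_k x)
rhs = x @ (J.T @ A @ J) @ x        # x^T (J_k^T A J_k) x

print(np.isclose(lhs, rhs), lhs > 0)  # prints True True
```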

pyroknife
Yes if you define x_k that way then it is true.

But that only proves the first part of post #20, you still need to prove that ##\Delta_k^{mxm}## is positive definite, right?
The proof to this latter part is what I've been trying to hint. It goes like this:

##(\vec y)^TA(\vec y) >0 \implies (J_k\vec x)^TA(J_k\vec x) = \vec x^T(J_k^TAJ_k)\vec x > 0##

If you can find a matrix Jk such that ##J_k^TAJ_k = \Delta_k^{mxm}## then that would prove the positive definiteness of ##\Delta_k^{mxm}## which would complete the proof. This matrix Jk is what I've been waiting for you to realize (it's the one I've been talking about which is similar to the identity).
OHHH. So earlier when you said a matrix that multiplies the matrix A to yield ##\Delta_k^{mxm}##, I thought you meant something like this: ##B*A = \Delta_k^{mxm}##. But now I get it; I hadn't realized that we needed to multiply on both the left and right sides of A. Actually, this is basically just what Ray Vickson had said, but in matrix notation.

But I get it now, thanks everyone for all the help.

Nathanael
Homework Helper
OHHH. So earlier when you said a matrix that multiplies the matrix A to yield ##\Delta_k^{mxm}##, I thought you meant something like this: ##B*A = \Delta_k^{mxm}##.
Well I kind of did mean that, because as it turns out ##J_kA = J_k^TAJ_k= \Delta_k^{mxm}##

But I get it now, thanks everyone for all the help.
But wait! There's one last step in my proof, which is to find a Jk such that ##J_k^TAJ_k= \Delta_k^{mxm}##

I'll just say it; Jk is the mxm matrix with the kxk identity matrix in the top left and zeros everywhere else. The transpose of Jk is still Jk, and multiplying A by Jk on either the left or right (or both) gives ##\Delta_k^{mxm}##.

pyroknife
Well I kind of did mean that, because as it turns out ##J_kA = J_k^TAJ_k= \Delta_k^{mxm}##

But wait! There's one last step in my proof, which is to find a Jk such that ##J_k^TAJ_k= \Delta_k^{mxm}##

I'll just say it; Jk is the mxm matrix with the kxk identity matrix in the top left and zeros everywhere else. The transpose of Jk is still Jk, and multiplying A by Jk on either the left or right (or both) gives ##\Delta_k^{mxm}##.
Hmmm, I said this earlier, but I tried it and it didn't work. Let me write it out.

Let's just assume ##A =
\begin{bmatrix}
1 & 1 & 1\\
1& 1 & 1\\
1& 1 & 1\\
\end{bmatrix}##

so ##\Delta_k^{mxm} =
\begin{bmatrix}
1 & 1 & 0\\
1& 1 & 0\\
0& 0 & 0\\
\end{bmatrix}##

But
##\begin{bmatrix}
1 & 0 & 0\\
0& 1 & 0\\
0& 0 & 0\\
\end{bmatrix}
\begin{bmatrix}
1 & 1 & 1\\
1& 1 & 1\\
1& 1 & 1\\
\end{bmatrix}
\neq \Delta_k^{mxm}##

But if we multiply this matrix on both sides, then we get ##\Delta_k^{mxm}##. So I don't think you can just multiply on one side?
So ##J_k A \neq J_k^T A J_k##
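This counterexample is easy to confirm numerically. A minimal sketch (assuming NumPy), using the same all-ones ##A## and ##k = 2##:

```python
import numpy as np

A = np.ones((3, 3))            # the example matrix from the post
J = np.zeros((3, 3))
J[:2, :2] = np.eye(2)          # J_k for k = 2

left_only = J @ A              # zeros out the last row only
both_sides = J.T @ A @ J       # zeros out the last row AND column

print(left_only)
print(both_sides)
print(np.array_equal(left_only, both_sides))  # prints False: one-sided is not enough
```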

Nathanael
Homework Helper
Hmmm, I said this earlier, but I tried it and it didn't work. Let me write it out.
...
But if we multiply this matrix on both sides, then we get ##\Delta_k^{mxm}##. So I don't think you can just multiply on one side?
So ##J_k A \neq J_k^T A J_k##
Oh darn! You're right! Multiplying on the left (##J_kA##) says "get rid of the last m-k rows" and multiplying on the right (##AJ_k##) says "get rid of the last m-k columns." We need to do both in order to get ##\Delta_k^{mxm}##!! Sorry for any confusion, I was looking at it the wrong way.

Luckily it doesn't change the proof.
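Indeed, the corrected two-sided version goes through. A numeric sketch of the finished argument (assuming NumPy, with an arbitrary positive definite ##A##): ##J_k^TAJ_k## equals ##\Delta_k^{mxm}## (the leading ##k \times k## block of A padded with zeros), and the leading ##k \times k## submatrix itself is positive definite, as claimed.

```python
import numpy as np

rng = np.random.default_rng(2)
m, k = 5, 3
M = rng.standard_normal((m, m))
A = M @ M.T + m * np.eye(m)    # a positive definite m x m matrix

J = np.zeros((m, m))
J[:k, :k] = np.eye(k)

# Two-sided multiplication keeps the leading k x k block of A and
# zeros out the last m-k rows AND columns: exactly Delta_k^{mxm}.
Delta_padded = J.T @ A @ J
assert np.allclose(Delta_padded[:k, :k], A[:k, :k])
assert np.allclose(Delta_padded[k:, :], 0.0)
assert np.allclose(Delta_padded[:, k:], 0.0)

# The leading k x k submatrix of A is positive definite:
eigs = np.linalg.eigvalsh(A[:k, :k])
print(np.all(eigs > 0))  # prints True
```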

pyroknife