Adel Makram said:
Again, will there be any relationship between the eigenvalues of the matrix and its block matrices?
I'll try to add a bit to this by ignoring eigenvectors and just focusing on eigenvalues. I assume that ##\mathbf A## and ##\mathbf B## are both ##n \times n## (i.e. ##n\geq 2##) with real entries.
E.g. consider
##\mathbf X =
\begin{bmatrix}\mathbf{A} & \mathbf 0 \\ \mathbf 0 & \mathbf{B}\end{bmatrix}
##
notice that ##det\big(\mathbf X\big) =det\big(\mathbf A\big)det\big(\mathbf B\big)##
(for a proof and a lot more info on block multiplication of matrices, look at page 108 of
https://math.byu.edu/~klkuttle/Linearalgebra.pdf )
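If you want to see this numerically, here's a quick numpy sketch (the size ##n = 3## and the random seed are just my arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
Z = np.zeros((n, n))

# block-diagonal X = [[A, 0], [0, B]]
X = np.block([[A, Z], [Z, B]])

# det(X) should match det(A) * det(B) up to roundoff
print(np.linalg.det(X), np.linalg.det(A) * np.linalg.det(B))
```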
and
##\mathbf Y =
\begin{bmatrix}\mathbf 0 &\mathbf{A} \\ \mathbf{B} &\mathbf 0\end{bmatrix}
##
##\big \vert det\big(\mathbf Y\big)\big \vert =\big \vert det \big(\mathbf A\big)det\big(\mathbf B\big)\big \vert##
(I put an absolute value sign in there as I want to ignore the sign implications related to the number of column swaps needed to "convert" ##\mathbf Y## to ##\mathbf X##; if you do track them, the sign works out to ##(-1)^n##.)
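Here's a numpy sketch of that sign pattern (the loop over a few values of ##n## is just my choice for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
for n in (2, 3, 4):
    A = rng.standard_normal((n, n))
    B = rng.standard_normal((n, n))
    Z = np.zeros((n, n))
    Y = np.block([[Z, A], [B, Z]])
    # swapping the two block columns takes n*n column transpositions,
    # so det(Y) = (-1)^(n^2) det(A) det(B) = (-1)^n det(A) det(B)
    print(n, np.linalg.det(Y), (-1) ** n * np.linalg.det(A) * np.linalg.det(B))
```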
Sometimes it's useful to just multiply these things out.
##\mathbf X^2 =
\begin{bmatrix}\mathbf{A}^2 & \mathbf 0 \\ \mathbf 0 & \mathbf{B}^2\end{bmatrix}
##
##\mathbf Y^2 =
\begin{bmatrix}\mathbf{AB} & \mathbf 0 \\ \mathbf 0 & \mathbf{BA}\end{bmatrix}
##
and as we'd expect ##det\big(\mathbf X^2\big) = det\big(\mathbf Y^2\big)##
You could say that the eigenvalues of ##\mathbf X## are just the eigenvalues of ##\mathbf A## together with the eigenvalues of ##\mathbf B##. The eigenvalues of ##\mathbf Y##, however, are tied to those of ##\big(\mathbf {AB}\big)##: each eigenvalue of ##\mathbf Y##, when squared, is an eigenvalue of ##\big(\mathbf {AB}\big)## -- which of course has the same eigenvalues as ##\big(\mathbf {BA}\big)##
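A small numpy sketch of that relationship (random matrices of my own choosing; I sort both spectra so they can be compared as multisets):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
Z = np.zeros((n, n))
Y = np.block([[Z, A], [B, Z]])

# squaring Y's eigenvalues reproduces the eigenvalues of AB, each appearing twice
# (once from the AB block of Y^2 and once from the BA block)
lam_Y_sq = np.sort_complex(np.linalg.eigvals(Y) ** 2)
lam_AB = np.sort_complex(np.repeat(np.linalg.eigvals(A @ B), 2))
print(np.allclose(lam_Y_sq, lam_AB))
```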
- - - - - - -
We can say a lot more if our submatrices have special structure. E.g. suppose ##\mathbf A## and ##\mathbf B## are both real symmetric. In this case we can say
##\mathbf X^2 = \mathbf X^T \mathbf X =
\begin{bmatrix}\mathbf{A}^T \mathbf A & \mathbf 0 \\ \mathbf 0 & \mathbf{B}^T\mathbf B \end{bmatrix}##
##\mathbf Y^T \mathbf Y = \begin{bmatrix}\mathbf{B}^T \mathbf B & \mathbf 0 \\ \mathbf 0 & \mathbf{A}^T\mathbf A \end{bmatrix}##
but this tells us that the sum of the eigenvalues of ##\mathbf X^2## is at least as large as the sum of those of ##\mathbf Y^2##, because ##trace\big(\mathbf X^2\big) = trace\big(\mathbf X^T\mathbf X\big) = trace\big(\mathbf Y^T\mathbf Y\big) \geq trace\big(\mathbf Y^2\big)##
(why?... also note if ##\mathbf A \neq \mathbf B## then the inequality is strict.)
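One way to answer the "why?": ##trace\big(\mathbf Y^T\mathbf Y\big) - trace\big(\mathbf Y^2\big) = trace\big((\mathbf A - \mathbf B)^2\big) = \big\Vert \mathbf A - \mathbf B\big\Vert_F^2 \geq 0##. A numpy sketch checking this identity (random symmetric matrices are my arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4
A = rng.standard_normal((n, n)); A = A + A.T  # real symmetric
B = rng.standard_normal((n, n)); B = B + B.T  # real symmetric

# trace(X^2) = trace(A^2) + trace(B^2) and trace(Y^2) = 2 trace(AB)
t_X2 = np.trace(A @ A) + np.trace(B @ B)
t_Y2 = 2 * np.trace(A @ B)

# the gap is trace((A - B)^2) = ||A - B||_F^2 >= 0, strict whenever A != B
print(t_X2 - t_Y2, np.linalg.norm(A - B, 'fro') ** 2)
```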
It's also worth pointing out that we can be certain that ##\mathbf X## only has real eigenvalues, whereas ##\mathbf Y## may have complex eigenvalues coming in conjugate pairs. (Note: this is
still true if ##\mathbf A## and ##\mathbf B## are both Hermitian, with scalars in ##\mathbb C##.)
So ##\mathbf X## and ##\mathbf Y## are linked via their determinants, but the actual structure of their eigenvalues is going to be rather different.
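And a quick check of the real-versus-complex claim (again a sketch, with random symmetric blocks of my choosing):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 3
A = rng.standard_normal((n, n)); A = A + A.T  # real symmetric
B = rng.standard_normal((n, n)); B = B + B.T  # real symmetric
Z = np.zeros((n, n))

X = np.block([[A, Z], [Z, B]])   # symmetric, so its eigenvalues are real
Y = np.block([[Z, A], [B, Z]])   # generally not symmetric

print(np.linalg.eigvals(X))  # real, up to roundoff
print(np.linalg.eigvals(Y))  # typically includes complex conjugate pairs
```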
- - - -
Stepping back a bit and thinking more generally, it's also worth looking at the block upper triangular matrix ##\mathbf T##
##\mathbf T =
\begin{bmatrix}\mathbf{A} & \mathbf * \\ \mathbf 0 & \mathbf{B}\end{bmatrix}
##
where ##\mathbf *## indicates entries we are not concerned with.
This matrix has its eigenvalues specified entirely by those of ##\mathbf A## and ##\mathbf B##. There are several ways to verify this. Kuttler shows the typical way. My preferred approach is to just multiply it out and note that for any natural number ##k = 1, 2, 3,...##
##\mathbf T^k =
\begin{bmatrix}\mathbf{A}^k & \mathbf * \\ \mathbf 0 & \mathbf{B}^k\end{bmatrix}
##
hence
##trace\big(\mathbf T^k\big) = trace\big(\mathbf{A}^k + \mathbf{B}^k\big) = trace\big(\mathbf{A}^k\big) + trace\big(\mathbf{B}^k\big)##
for all natural numbers ##k##. Since these power-sum traces determine a matrix's eigenvalues (via Newton's identities), the spectrum of ##\mathbf T## is exactly the spectrum of ##\mathbf A## together with the spectrum of ##\mathbf B##.
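Finally, a numpy sketch of the block triangular case (the ##\mathbf *## block is filled with an arbitrary random matrix, which I call C here purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 3
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
C = rng.standard_normal((n, n))  # the "don't care" * block

T = np.block([[A, C], [np.zeros((n, n)), B]])

# the spectrum of T is the spectrum of A together with the spectrum of B
# (sorting both to compare as multisets; fine for generic random matrices)
lam_T = np.sort_complex(np.linalg.eigvals(T))
lam_A_and_B = np.sort_complex(np.concatenate([np.linalg.eigvals(A), np.linalg.eigvals(B)]))
print(np.allclose(lam_T, lam_A_and_B))

# consistent with trace(T^k) = trace(A^k) + trace(B^k) for each k
for k in range(1, 5):
    lhs = np.trace(np.linalg.matrix_power(T, k))
    rhs = np.trace(np.linalg.matrix_power(A, k)) + np.trace(np.linalg.matrix_power(B, k))
    print(k, lhs, rhs)
```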