Adel Makram
If a matrix is formed from blocks of 2 x 2 matrices, what is the relation between the eigenvalues and eigenvectors of that matrix and the eigenvalues and eigenvectors of the sub-matrices?
hilbert2 said: If the matrices ##\mathbf{A}## and ##\mathbf{B}## are 2x2 matrices, ##\mathbf{C} = \begin{bmatrix}\mathbf{A}\hspace{10pt}0\\0\hspace{10pt}\mathbf{B}\end{bmatrix}## is a block matrix formed from them, and ##\mathbf{v}=\begin{bmatrix}a\\b\\c\\d\end{bmatrix}## is an eigenvector of ##\mathbf{C}## with eigenvalue ##\lambda##, then ##\lambda## must also be an eigenvalue of both ##\mathbf{A}## and ##\mathbf{B}##, or at least an eigenvalue of one of them in the case where ##a=b=0## or ##c=d=0##.
This is clear if ##\mathbf{C}## is a diagonal matrix whose entries are real numbers; in that case the eigenvectors of ##\mathbf{C}## are ##(1,0,0,\dots,0)^T##, ..., ##(0,0,0,\dots,1)^T## and the eigenvalues are the diagonal entries themselves. The eigenvectors and eigenvalues of each block follow the same pattern.
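The block-diagonal observation above is easy to check numerically. A minimal NumPy sketch, using two hypothetical symmetric 2x2 blocks chosen only for illustration:

```python
import numpy as np

# Hypothetical symmetric 2x2 blocks, chosen only for illustration.
A = np.array([[1.0, 2.0],
              [2.0, 1.0]])   # eigenvalues: 3, -1
B = np.array([[2.0, 3.0],
              [3.0, 2.0]])   # eigenvalues: 5, -1
Z = np.zeros((2, 2))

# Block-diagonal matrix C = [[A, 0], [0, B]].
C = np.block([[A, Z],
              [Z, B]])

eig_C = np.sort(np.linalg.eigvals(C).real)
eig_blocks = np.sort(np.concatenate([np.linalg.eigvals(A).real,
                                     np.linalg.eigvals(B).real]))

# The spectrum of C is the union of the spectra of A and B.
print(eig_C)       # [-1. -1.  3.  5.]
print(eig_blocks)  # [-1. -1.  3.  5.]
```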
hilbert2 said: It also holds in the case ##\mathbf{C}=\begin{bmatrix}1 & 2 & 0 & 0 \\ 2 & 1 & 0 & 0 \\ 0 & 0 & 2 & 3 \\ 0 & 0 & 3 & 2\end{bmatrix}## or some other block-diagonal matrix.
I used an online matrix calculator and found no obvious correlation between the eigenvectors or eigenvalues of ##\begin{bmatrix}\mathbf{A} & 0 \\ 0 & \mathbf{B}\end{bmatrix}## and ##\begin{bmatrix}0 & \mathbf{A} \\ \mathbf{B} & 0\end{bmatrix}##. There is of course the obvious similarity, described above, between the first matrix and its two blocks. However, that similarity is not there for the second one. Also, an antisymmetric matrix should, by definition, have its transpose equal to its negative, so the second matrix should be called something else.
An eigenvector of ##\begin{bmatrix}\mathbf{A} & 0 \\ 0 & \mathbf{B}\end{bmatrix}## is also an eigenvector of ##\begin{bmatrix}0 & \mathbf{A} \\ \mathbf{B} & 0\end{bmatrix}##, if it is symmetric or antisymmetric under interchange of its upper two and lower two elements:
##\begin{bmatrix}a \\ b \\ c \\ d\end{bmatrix} = \pm\begin{bmatrix}c \\ d \\ a \\ b\end{bmatrix}##.
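This can be verified numerically whenever ##\mathbf{A}## and ##\mathbf{B}## share a common eigenvector with a common eigenvalue. A sketch with hypothetical blocks chosen so that ##u = (1,-1)^T## is an eigenvector of both, with eigenvalue -1:

```python
import numpy as np

# Hypothetical blocks sharing eigenvector u = (1, -1)^T, eigenvalue -1.
A = np.array([[1.0, 2.0], [2.0, 1.0]])
B = np.array([[2.0, 3.0], [3.0, 2.0]])
Z = np.zeros((2, 2))

C = np.block([[A, Z], [Z, B]])   # block-diagonal
Y = np.block([[Z, A], [B, Z]])   # blocks on the anti-diagonal

u = np.array([1.0, -1.0])
v_sym = np.concatenate([u, u])     # symmetric under swapping halves
v_anti = np.concatenate([u, -u])   # antisymmetric under swapping halves

print(C @ v_sym)    # -1 * v_sym  -> eigenvector of C
print(Y @ v_sym)    # -1 * v_sym  -> also an eigenvector of Y
print(C @ v_anti)   # -1 * v_anti -> eigenvector of C
print(Y @ v_anti)   # +1 * v_anti -> eigenvector of Y, sign flips
```

Note that for the antisymmetric vector the shared eigenvalue changes sign when moving from the block-diagonal matrix to the anti-diagonal one.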
Adel Makram said:Again, will be any relationship between the eigen values of the matrix and its blocks matrices?
StoneTemplePython said: You could say that the eigenvalues of ##\mathbf X## are a function of the eigenvalues of ##\mathbf A## and the eigenvalues of ##\mathbf B##. However, the eigenvalues of ##\mathbf Y## are in some sense a function of the eigenvalues of ##\big(\mathbf{AB}\big)## -- which of course are the same as those for ##\big(\mathbf{BA}\big)##.
Thank you for the comprehensive analysis. But should there be any closed form for that function of the eigenvalues of A and the eigenvalues of B?
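There is a partial closed form behind the ##\mathbf{AB}## remark: since ##\mathbf Y^2 = \begin{bmatrix}\mathbf{AB} & 0 \\ 0 & \mathbf{BA}\end{bmatrix}##, the squares of the eigenvalues of ##\mathbf Y## are exactly the eigenvalues of ##\mathbf{AB}## and ##\mathbf{BA}## combined. A quick NumPy check with random (hypothetical, seeded) 2x2 blocks:

```python
import numpy as np

# Random 2x2 blocks (hypothetical; seeded for reproducibility).
rng = np.random.default_rng(0)
A = rng.standard_normal((2, 2))
B = rng.standard_normal((2, 2))
Z = np.zeros((2, 2))

Y = np.block([[Z, A], [B, Z]])

# Y @ Y is block-diagonal with blocks A@B and B@A, so the squared
# eigenvalues of Y equal the eigenvalues of A@B and B@A combined.
lhs = np.sort(np.linalg.eigvals(Y) ** 2)
rhs = np.sort(np.concatenate([np.linalg.eigvals(A @ B),
                              np.linalg.eigvals(B @ A)]))
print(np.allclose(lhs, rhs))  # True
```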
Adel Makram said: Thank you for the comprehensive analysis. But should there be any closed form for that function of the eigenvalues of A and the eigenvalues of B?
For example, suppose X represents the transition probabilities of a Markov chain process; will I be able to represent it, using some transformation, by Markov transition processes at the level of 2 x 2 submatrices? Of course, in that case the sum of probabilities in each row equals 1, because there will be only 2 entries in each row and each column, so the probability-completion condition is satisfied.
StoneTemplePython said: You may need to re-ask this as I'm not totally sure what you're getting at. In your example, if you have a Markov chain with 2 distinct non-communicating classes of nodes, then you may be able to use ##\mathbf X##.
This is close to what I am thinking of. If X is an nxn matrix in a certain vector space, with entries representing some coefficients, it would be wonderful if we could reduce this representation to some form of 2x2 blocks of its states, even if that is in some other space. For example, in the Markov case, if X is the transition probability matrix of 10 states representing some class, or space, of variables, would it be possible to transform it into blocks of 2x2 matrices of many 2-state systems in some other transformed space? In general, if we have a graph with many branches from any node, will it be possible to transform it into another graph with only a two-branch system at every node?
Adel Makram said: if we have a graph with many branches from any node, will it be possible to transform it into another graph with only a two-branch system at every node?
In general, looking at the graph-theoretic implications of having block matrices with special structure (e.g. ##\mathbf T## would have ##\mathbf A## and ##\mathbf B## as recurrent classes and ##*## as transient states in a Markov chain) can be quite enlightening.
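The block-triangular structure behind that remark has a clean spectral consequence: for a block upper-triangular matrix the characteristic polynomial factors into those of the diagonal blocks, so the coupling block never affects the eigenvalues. A sketch with hypothetical blocks:

```python
import numpy as np

# Hypothetical diagonal blocks A, B and an arbitrary coupling block R.
A = np.array([[1.0, 2.0], [2.0, 1.0]])   # eigenvalues: 3, -1
B = np.array([[2.0, 3.0], [3.0, 2.0]])   # eigenvalues: 5, -1
R = np.array([[5.0, 6.0], [7.0, 8.0]])
Z = np.zeros((2, 2))

# Block upper-triangular T = [[A, R], [0, B]]: det(T - x*I) equals
# det(A - x*I) * det(B - x*I), so R plays no role in the spectrum.
T = np.block([[A, R], [Z, B]])

eig_T = np.sort(np.linalg.eigvals(T).real)
print(eig_T)   # [-1. -1.  3.  5.]
```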
Eigenvalues of block matrices refer to the characteristic values of a square matrix that has been partitioned into submatrices, also known as blocks. These eigenvalues represent the scaling factors of the eigenvectors of the matrix.
The eigenvalues of block matrices can be calculated by finding the characteristic polynomial of the matrix, which is the determinant of the matrix minus a scalar multiple of the identity matrix. This polynomial can then be solved to find the eigenvalues.
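As a sketch of that procedure, NumPy can build the characteristic polynomial with `np.poly` and solve it with `np.roots` (using hilbert2's block-diagonal example from earlier in the thread):

```python
import numpy as np

# hilbert2's block-diagonal example from earlier in the thread.
C = np.array([[1.0, 2.0, 0.0, 0.0],
              [2.0, 1.0, 0.0, 0.0],
              [0.0, 0.0, 2.0, 3.0],
              [0.0, 0.0, 3.0, 2.0]])

coeffs = np.poly(C)               # coefficients of det(x*I - C)
eigs = np.sort(np.roots(coeffs).real)
print(eigs)                       # approximately [-1, -1, 3, 5]
```

In practice `np.linalg.eigvals(C)` is the numerically preferred route; forming the characteristic polynomial explicitly is mainly of pedagogical interest.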
The eigenvalues of block matrices are important in many areas of science, including physics, engineering, and computer science. They can be used to analyze the stability and behavior of systems, and to solve differential equations and other mathematical problems.
For a block-diagonal (or, more generally, block-triangular) matrix the relationship is simple: the characteristic polynomial factors into those of the diagonal blocks, so the eigenvalues of the whole matrix are the union, counted with multiplicity, of the eigenvalues of the diagonal blocks. For an arbitrary partition into blocks, with nonzero off-diagonal blocks, there is no such simple formula.
If the diagonal blocks of a block-diagonal matrix are rearranged, the set of eigenvalues is unchanged, since the rearrangement is a permutation similarity; adding or removing diagonal blocks adds or removes their eigenvalues from the union. Changing the off-diagonal blocks, however, can alter the eigenvalues in ways that admit no simple description in terms of the individual blocks.