Can a theorem simplify finding eigenvalues of a block matrix?

I need to find the eigenvalues and eigenvectors of a matrix of the form
$$\left( \begin{array}{cc} X_1 & X_2 \\ X_2 & X_1 \end{array} \right)$$
where the $X_i$'s are themselves $M \times M$ matrices of the form
$$X_i = x_i \left( \begin{array}{cccc} 1 & 1 & \cdots & 1 \\ 1 & 1 & \cdots & 1 \\ \vdots & \vdots & \ddots & \vdots \\ 1 & 1 & \cdots & 1 \end{array} \right)$$
Is there any theorem that could help? Something like: if you find the eigenvalues of the $X_i$'s, then the eigenvalues of the block matrix are...

Thanks
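
For concreteness, a small instance of this block matrix can be built like so (a sketch in Mathematica, with assumed sample values M = 3, x1 = 2, x2 = 3):

(* build the 2M x 2M block matrix from the M x M blocks *)
M = 3; x1 = 2; x2 = 3;
ones = ConstantArray[1, {M, M}];     (* the M x M matrix of all ones *)
X1 = x1*ones; X2 = x2*ones;          (* the two blocks *)
ArrayFlatten[{{X1, X2}, {X2, X1}}]   (* assemble the 2M x 2M matrix *)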
 
Yours are not "circulant" matrices, but they are sort of similar. Maybe you will get some ideas by learning about circulant matrices on Wikipedia (and it never hurts to learn a little more about matrices):

http://en.wikipedia.org/wiki/Circulant_matrix

Carl
 
Yeah, there's a theorem... it was part of my dynamical systems course; actually you should learn it in an ODE class. Sorry, but I don't have my text nearby.
It's something to do with nilpotent matrices, if I recall correctly.
 
I found out that the matrix can be rewritten as
$$\left( \begin{array}{cc} x_1 & x_2 \\ x_2 & x_1 \end{array} \right) \otimes \left( \begin{array}{cccc} 1 & 1 & \cdots & 1 \\ 1 & 1 & \cdots & 1 \\ \vdots & \vdots & \ddots & \vdots \\ 1 & 1 & \cdots & 1 \end{array} \right)$$
So I now need to prove that the determinant of the $M \times M$ matrix of all ones minus $\lambda I$ is
$$(-1)^M \lambda^{M-1}(\lambda - M)$$
Any ideas?
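
(A quick symbolic check of that formula for one value of M, say M = 5, in Mathematica; note that CharacteristicPolynomial[m, t] is Det[m - t*IdentityMatrix[Length[m]]]:

M = 5;
J = ConstantArray[1, {M, M}];
Expand[CharacteristicPolynomial[J, t] - (-1)^M*t^(M - 1)*(t - M)]
(* returns 0, so the claimed determinant holds for M = 5 *))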
 
You can find that determinant quite easily using row reductions, though since you only want to find the eigenvectors and eigenvalues (and that is simple in this case), the determinant is unnecessary.
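
For instance, row reducing the all-ones matrix shows its rank (and hence the eigenvalue multiplicities) at a glance; a sketch in Mathematica, assuming M = 4:

RowReduce[ConstantArray[1, {4, 4}]]
(* {{1, 1, 1, 1}, {0, 0, 0, 0}, {0, 0, 0, 0}, {0, 0, 0, 0}}: rank 1, so 0 has multiplicity M - 1 *)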
 
I tried evaluating the eigenvectors of the matrix filled with ones for M = 6 in Mathematica, and here's what I get:

{{1, 1, 1, 1, 1, 1},
{-1, 0, 0, 0, 0, 1},
{-1, 0, 0, 0, 1, 0},
{-1, 0, 0, 1, 0, 0},
{-1, 0, 1, 0, 0, 0},
{-1, 1, 0, 0, 0, 0}}

The first one corresponds to the $\lambda = M$ eigenvalue and the others to $\lambda = 0$, but they're not orthogonal to each other; they are only orthogonal to the first one! I know I could try linear combinations of those vectors, but in the case where M is very large this becomes a bit confusing...

Any ideas?
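
(One mechanical way to take those linear combinations for any M is to let Mathematica do the Gram-Schmidt. A sketch, assuming M = 6:

M = 6;
vecs = Eigenvectors[ConstantArray[1, {M, M}]];
Orthogonalize[vecs]
(* orthonormal set: the last M - 1 vectors span the kernel and stay orthogonal to (1,...,1) *))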
 
Why do you want something to be orthogonal to something else? Who has even said that we're working with a vector space over a field with a nondegenerate inner product?

By inspection the matrix has a 0-, 1-, or 2-dimensional image: 0 if $x_1 = x_2 = 0$; 1 if $x_1 = \pm x_2$ with not both zero; 2 otherwise (row reduce).

In any case you can use what you just did to work out the eigenvectors not in the kernel and the eigenvectors that are killed.

Hint: split a vector with 2M entries in half. $(1, -1, 0, \ldots, 0)$

is certainly killed by the matrix, as is $(0, \ldots, 0, 1, -1, 0, \ldots, 0)$, where there are $M$ zeros before the 1.
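
A quick numeric check of that hint (a sketch, with assumed sample values M = 4, x1 = 2, x2 = 3):

M = 4; x1 = 2; x2 = 3;
big = KroneckerProduct[{{x1, x2}, {x2, x1}}, ConstantArray[1, {M, M}]];
big . (UnitVector[2*M, 1] - UnitVector[2*M, 2])
(* the zero vector: (1, -1, 0, ..., 0) is indeed killed *)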
 
I think the answer is here:
http://cellular.ci.ulsa.mx/comun/algebra/node65.html
Basically, if $PX = \lambda X$ and $QY = \mu Y$, then the following holds:

$$(P \otimes Q)(X \otimes Y) = (\lambda\mu)(X \otimes Y)$$.

This, I guess, implies that the $(X \otimes Y)$ are eigenvectors of $(P \otimes Q)$ (by definition) and the $(\lambda\mu)$ are its eigenvalues. Since the $M \times M$ matrix of all ones has rank 1, its only nonzero eigenvalue is its trace, $M$. The eigenvalues of the small matrix of $x$'s can be found in closed form by solving the associated quadratic: they are $x_1 + x_2$ and $x_1 - x_2$.
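
A sketch checking both claims numerically in Mathematica (assumed sample values M = 4, x1 = 2, x2 = 3):

M = 4; x1 = 2; x2 = 3;
P = {{x1, x2}, {x2, x1}};
Q = ConstantArray[1, {M, M}];
Sort[Eigenvalues[KroneckerProduct[P, Q]]]                    (* {-4, 0, 0, 0, 0, 0, 0, 20} *)
Sort[Flatten[Outer[Times, Eigenvalues[P], Eigenvalues[Q]]]]  (* the same list: (x1 +/- x2)*M plus zeros *)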
 