# Homework Help: Block matrix eigenvalues

1. Dec 19, 2005

### goulio

I need to find the eigenvalues and eigenvectors of a matrix of the form
$$\left ( \begin{array}{cc} X_1 & X_2 \\ X_2 & X_1 \end{array} \right )$$
where the $X_i$'s are themselves $M \times M$ matrices of the form
$$X_i = x_i \left ( \begin{array}{cccc} 1 & 1 & \cdots & 1 \\ 1 & 1 & \cdots & 1 \\ \vdots & \vdots & \ddots & \vdots \\ 1 & 1 & \cdots & 1 \end{array} \right )$$
Is there any theorem that could help? Something like: if you find the eigenvalues of the $X_i$'s, then the eigenvalues of the block matrix are...

Thanks

2. Dec 19, 2005

### CarlB

Yours are not "circulant" matrices, but they are sort of similar. Maybe you will get some ideas by learning about circulant matrices on Wikipedia (and it never hurts to learn a little more about matrices):

http://en.wikipedia.org/wiki/Circulant_matrix

Carl

3. Dec 19, 2005

### neurocomp2003

Yeah, there's a theorem... it was part of my dynamical systems course; actually you should learn it in an ODE class. Sorry, but I don't have my text nearby.
It's something to do with nilpotent matrices, if I recall correctly.

4. Dec 19, 2005

### goulio

I found out that the matrix can be rewritten as
$$\left ( \begin{array}{cc}x_1 & x_2 \\ x_2 & x_1 \end{array} \right ) \otimes \left ( \begin{array}{cccc}1 & 1 & \cdots & 1 \\ 1 & 1 & \cdots & 1 \\ \vdots & \vdots & \ddots & \vdots \\ 1 & 1 & \cdots & 1 \end{array} \right )$$
So I now need to prove that the determinant of matrix filled with ones minus $\lambda I$ is
$$(-1)^M \lambda^{M-1}(\lambda - M)$$
Any ideas?
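A quick numerical sanity check of both claims is easy to set up. The sketch below (using NumPy; the values of $M$, $x_1$, $x_2$ are arbitrary illustrative choices, not from the thread) confirms that the block matrix equals the Kronecker product, and that the determinant formula holds at a few sample values of $\lambda$:

```python
import numpy as np

M = 4
x1, x2 = 2.0, 3.0
J = np.ones((M, M))                       # the M x M all-ones block

# Block matrix [[x1*J, x2*J], [x2*J, x1*J]] ...
A = np.block([[x1 * J, x2 * J], [x2 * J, x1 * J]])

# ... equals the Kronecker product [[x1, x2], [x2, x1]] (x) J
K = np.kron(np.array([[x1, x2], [x2, x1]]), J)
assert np.allclose(A, K)

# Check det(J - lam*I) = (-1)^M * lam^(M-1) * (lam - M) at sample points
for lam in (0.5, 1.7, -2.0):
    lhs = np.linalg.det(J - lam * np.eye(M))
    rhs = (-1) ** M * lam ** (M - 1) * (lam - M)
    assert np.isclose(lhs, rhs)
```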

5. Dec 20, 2005

### matt grime

You can find that determinant quite easily using row reductions, though since you only want to find the eigenvectors and eigenvalues (and that is simple in this case), it is unnecessary.
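For reference, one way the row reduction can go (a sketch): every column of $J - \lambda I$ sums to $M - \lambda$, so adding rows $2, \dots, M$ to row 1 turns row 1 into $(M - \lambda)(1, 1, \dots, 1)$; factor out $M - \lambda$. Subtracting the new row 1 from each of rows $2, \dots, M$ leaves those rows with $-\lambda$ on the diagonal and zeros elsewhere, so the matrix is triangular and
$$\det(J - \lambda I) = (M - \lambda) \cdot (-\lambda)^{M-1} = (-1)^M \lambda^{M-1} (\lambda - M).$$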

6. Dec 20, 2005

### goulio

I tried evaluating the eigenvectors of the matrix filled with ones for M = 6 in Mathematica, and here's what I get:

{{1, 1, 1, 1, 1, 1},
{-1, 0, 0, 0, 0, 1},
{-1, 0, 0, 0, 1, 0},
{-1, 0, 0, 1, 0, 0},
{-1, 0, 1, 0, 0, 0},
{-1, 1, 0, 0, 0, 0}}

The first one corresponds to the $\lambda = M$ eigenvalue and the others to $\lambda = 0$, but the $\lambda = 0$ eigenvectors are not orthogonal to each other; they are only orthogonal to the first one! I know I could take linear combinations of those vectors, but when M is very large this becomes a bit confusing...

Any ideas?
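Since the all-ones matrix $J$ is symmetric, an orthogonal eigenbasis always exists, and there is a standard explicit choice for the $\lambda = 0$ eigenspace that works for any $M$: the "staircase" vectors $v_k = (1, \dots, 1, -k, 0, \dots, 0)$ with $k$ ones followed by $-k$. A NumPy sketch (the construction is a standard one, not from the thread) verifying this for $M = 6$:

```python
import numpy as np

M = 6
J = np.ones((M, M))

# lambda = M eigenvector: the all-ones vector
ones = np.ones(M)
assert np.allclose(J @ ones, M * ones)

# lambda = 0 eigenvectors: v_k has k ones followed by a single -k,
# then zeros; each sums to zero, so J v_k = 0.
basis = []
for k in range(1, M):
    v = np.zeros(M)
    v[:k] = 1.0
    v[k] = -float(k)
    basis.append(v)

for i, v in enumerate(basis):
    assert np.allclose(J @ v, 0)      # in the kernel of J
    assert np.isclose(v @ ones, 0)    # orthogonal to the lambda = M vector
    for w in basis[i + 1:]:
        assert np.isclose(v @ w, 0)   # mutually orthogonal by construction
```

Orthogonality holds because for $j > k$ the dot product $v_k \cdot v_j$ sums $k$ ones against ones and the $-k$ entry against a one, giving $k - k = 0$.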

7. Dec 20, 2005

### matt grime

Why do you want something to be orthogonal to something else? Who has even said that we're working with a vector space over a field with a nondegenerate inner product?

By inspection the matrix has either a 0-, 1-, or 2-dimensional image: 0 if $x_1 = x_2 = 0$, 1 if $x_1 = \pm x_2$ (not both zero), and 2 otherwise (row reduce).

In any case you can use what you just did to work out the eigenvectors not in the kernel and the eigenvectors that are killed.

Hint: split a vector with 2M entries in half. The vector $(1, -1, 0, \dots, 0)$

is certainly killed by the matrix, as is $(0, \dots, 0, 1, -1, 0, \dots, 0)$, where there are M zeroes before the 1.
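The hint checks out numerically, and the split-in-half idea also produces the two eigenvectors outside the kernel. A NumPy sketch ($M$, $x_1$, $x_2$ are illustrative values):

```python
import numpy as np

M = 5
x1, x2 = 2.0, 3.0
J = np.ones((M, M))
A = np.block([[x1 * J, x2 * J], [x2 * J, x1 * J]])   # the 2M x 2M matrix

# (1, -1, 0, ..., 0) and its copy shifted into the bottom half are killed
v = np.zeros(2 * M); v[0], v[1] = 1.0, -1.0
w = np.zeros(2 * M); w[M], w[M + 1] = 1.0, -1.0
assert np.allclose(A @ v, 0)
assert np.allclose(A @ w, 0)

# The two eigenvectors not in the kernel: halves equal, or halves opposite
u_plus  = np.concatenate([np.ones(M),  np.ones(M)])   # eigenvalue M*(x1+x2)
u_minus = np.concatenate([np.ones(M), -np.ones(M)])   # eigenvalue M*(x1-x2)
assert np.allclose(A @ u_plus,  M * (x1 + x2) * u_plus)
assert np.allclose(A @ u_minus, M * (x1 - x2) * u_minus)
```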

Last edited: Dec 20, 2005
8. Dec 11, 2008

### boreshkin

I think the answer is here:
http://cellular.ci.ulsa.mx/comun/algebra/node65.html [Broken]
Basically,
for any eigenpairs $PX = \lambda X$ and $QY = \mu Y$, the following holds:

$$(P \otimes Q)(X \otimes Y) = (\lambda\mu)(X \otimes Y)$$.

This implies (by definition) that the $X \otimes Y$ are eigenvectors of $P \otimes Q$ with eigenvalues $\lambda\mu$. $M$ is the only non-zero eigenvalue of the $M \times M$ matrix of all ones (it has rank 1, so 0 has multiplicity $M - 1$, and its trace is $M$). The eigenvalues of the small matrix consisting of $x$'s are $x_1 \pm x_2$, found in closed form by solving the associated quadratic.
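Putting the pieces together, the full spectrum of the $2M \times 2M$ matrix should be $M(x_1 + x_2)$, $M(x_1 - x_2)$, and $0$ with multiplicity $2M - 2$. A NumPy sketch checking both the Kronecker eigenvalue identity and the resulting spectrum ($M$, $x_1$, $x_2$ are illustrative values):

```python
import numpy as np

M = 4
x1, x2 = 1.0, 5.0
P = np.array([[x1, x2], [x2, x1]])
Q = np.ones((M, M))

# Eigenpair of the small matrix: x1 + x2 with eigenvector (1, 1)
X = np.array([1.0, 1.0]);  lam = x1 + x2
assert np.allclose(P @ X, lam * X)
# Eigenpair of the all-ones matrix: M with the all-ones vector
Y = np.ones(M);            mu = float(M)
assert np.allclose(Q @ Y, mu * Y)

# Kronecker products of eigenvectors are eigenvectors of P (x) Q
PQ = np.kron(P, Q)
assert np.allclose(PQ @ np.kron(X, Y), lam * mu * np.kron(X, Y))

# Full spectrum: M*(x1+x2), M*(x1-x2), and 0 with multiplicity 2M - 2
eig = np.sort(np.linalg.eigvalsh(PQ))
expected = np.sort([M * (x1 + x2), M * (x1 - x2)] + [0.0] * (2 * M - 2))
assert np.allclose(eig, expected)
```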

Last edited by a moderator: May 3, 2017