Can a theorem simplify finding eigenvalues of a block matrix?


Homework Help Overview

The discussion revolves around finding the eigenvalues and eigenvectors of a specific block matrix composed of matrices of the form X_i, which are defined as scaled matrices filled with ones. Participants are exploring the potential application of theorems related to eigenvalues in this context.

Discussion Character

  • Exploratory, Conceptual clarification, Mathematical reasoning

Approaches and Questions Raised

  • Participants discuss the structure of the block matrix and its relation to circulant matrices. There are mentions of theorems from dynamical systems and ODE courses that may apply. Some participants are attempting to rewrite the matrix in a different form to facilitate finding the determinant and eigenvalues. Others are evaluating eigenvectors for specific cases and questioning the orthogonality of the results.

Discussion Status

The discussion is active, with various approaches being explored. Some participants have provided links to external resources that may contain relevant information. There is an ongoing examination of the properties of the matrices involved, and hints have been offered regarding the structure of eigenvectors and the determinant calculation.

Contextual Notes

Participants are working under the constraints of homework rules, which may limit the amount of direct assistance they can provide. There are also discussions about the dimensionality of the image of the matrix and the implications for eigenvalues and eigenvectors.

goulio
Messages
14
Reaction score
0
I need to find the eigenvalues and eigenvectors of a matrix of the form
[tex] \left ( \begin{array}{cc}
X_1 & X_2 \\
X_2 & X_1
\end{array} \right )[/tex]
where the [itex]X_i[/itex]'s are themselves [itex]M \times M[/itex] matrices of the form
[tex] X_i = x_i \left ( \begin{array}{cccc}
1 & 1 & \cdots & 1 \\
1 & 1 & \cdots & 1 \\
\vdots & \vdots & \ddots & \vdots \\
1 & 1 & \cdots & 1
\end{array} \right )[/tex]
Is there any theorem that could help? Something like: if you find the eigenvalues of the [itex]X_i[/itex]'s, then the eigenvalues of the block matrix are...

Thanks
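For concreteness, here is a small numerical sketch (an editorial addition, not from the thread) that builds this block matrix for an arbitrarily chosen M = 3, x1 = 2, x2 = 5 and lists its eigenvalues with NumPy:

```python
import numpy as np

# Illustrative values only: M, x1, x2 are arbitrary choices
M, x1, x2 = 3, 2.0, 5.0
J = np.ones((M, M))            # the M x M matrix of all ones
X1, X2 = x1 * J, x2 * J

# Assemble the 2M x 2M block matrix [[X1, X2], [X2, X1]]
A = np.block([[X1, X2], [X2, X1]])

# A is symmetric, so eigvalsh applies; it returns eigenvalues in ascending order.
# Theory discussed later in the thread predicts M*(x1 - x2) and M*(x1 + x2)
# once each, and 0 with multiplicity 2M - 2.
eigvals = np.linalg.eigvalsh(A)
print(np.round(eigvals, 10))
```

With these values the nonzero eigenvalues come out as 3·(2 − 5) = −9 and 3·(2 + 5) = 21, with four zeros in between.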
 
Yours are not "circulant" matrices, but they are sort of similar. Maybe you will get some ideas by learning about circulant matrices on Wikipedia (and it never hurts to learn a little more about matrices):

http://en.wikipedia.org/wiki/Circulant_matrix

Carl
 
Yeah, there's a theorem... it was part of my dynamical systems course; actually, you should learn it in an ODE class. Sorry, but I don't have my text nearby.
It's something to do with nilpotent matrices, if I recall correctly.
 
I found out that the matrix can be rewritten as
[tex] \left ( \begin{array}{cc}
x_1 & x_2 \\
x_2 & x_1
\end{array} \right ) \otimes
\left ( \begin{array}{cccc}
1 & 1 & \cdots & 1 \\
1 & 1 & \cdots & 1 \\
\vdots & \vdots & \ddots & \vdots \\
1 & 1 & \cdots & 1
\end{array} \right )[/tex]
So I now need to prove that the determinant of the matrix filled with ones, minus [itex]\lambda I[/itex], is
[tex] (-1)^M \lambda^{M-1}(\lambda - M)[/tex]
Any ideas?
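As a quick sanity check (an editorial addition, not from the thread), the claimed formula [itex](-1)^M \lambda^{M-1}(\lambda - M)[/itex] can be compared against a direct numerical determinant for a few small M and arbitrary test values of [itex]\lambda[/itex]:

```python
import numpy as np

# Check det(J - lam*I) == (-1)^M * lam^(M-1) * (lam - M)
# for the M x M all-ones matrix J, at a few arbitrary sample points.
for M in (2, 3, 5):
    J = np.ones((M, M))
    for lam in (-1.5, 0.5, 2.0):
        lhs = np.linalg.det(J - lam * np.eye(M))
        rhs = (-1) ** M * lam ** (M - 1) * (lam - M)
        assert np.isclose(lhs, rhs), (M, lam)
print("determinant formula checked")
```

The formula itself follows from the eigenvalues of the all-ones matrix: M once and 0 with multiplicity M − 1, so det(J − λI) = (M − λ)(−λ)^{M−1}.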
 
You can find that determinant quite easily using row reduction, though since you only want to find eigenvectors and eigenvalues (and that is simple in this case), it is unnecessary.
 
I tried evaluating the eigenvectors of the matrix filled with ones for M = 6 in Mathematica, and here's what I get:

{{1, 1, 1, 1, 1, 1},
{-1, 0, 0, 0, 0, 1},
{-1, 0, 0, 0, 1, 0},
{-1, 0, 0, 1, 0, 0},
{-1, 0, 1, 0, 0, 0},
{-1, 1, 0, 0, 0, 0}}

The first one corresponds to the [itex]\lambda = M[/itex] eigenvalue and the others to [itex]\lambda = 0[/itex], but they're not orthogonal to each other; they are only orthogonal to the first one! I know I could try linear combinations of those vectors, but when M is very large this becomes a bit confusing...

Any ideas?
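If an orthonormal basis of the λ = 0 eigenspace is wanted for general M, one option (an editorial suggestion, not from the thread) is to orthogonalize the vectors Mathematica returned, e.g. with a QR decomposition:

```python
import numpy as np

M = 6
# The lambda = 0 eigenvectors reported above, one per row
V = np.array([[-1, 0, 0, 0, 0, 1],
              [-1, 0, 0, 0, 1, 0],
              [-1, 0, 0, 1, 0, 0],
              [-1, 0, 1, 0, 0, 0],
              [-1, 1, 0, 0, 0, 0]], dtype=float)

# QR on the transpose: the columns of Q are an orthonormal basis
# of the same subspace spanned by the rows of V.
Q, _ = np.linalg.qr(V.T)

J = np.ones((M, M))
assert np.allclose(J @ Q, 0)            # still inside the kernel of J
assert np.allclose(Q.T @ Q, np.eye(5))  # now mutually orthonormal
print("orthonormal kernel basis found")
```

This works for any M, since each kernel vector has entries summing to zero and QR preserves the span.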
 
Why do you want something to be orthogonal to something else? Who has even said that we're working with a vector space over a field with a nondegenerate inner product?

By inspection, the matrix has a 0-, 1-, or 2-dimensional image: 0 if a = b = 0, 1 if a = ±b (not both zero), and 2 otherwise (row reduce).

In any case you can use what you just did to work out the eigenvectors not in the kernel and the eigenvectors that are killed.

Hint: split a vector with 2M entries in half. (1, -1, 0, ..., 0)

is certainly killed by the matrix, as is (0, ..., 0, 1, -1, 0, ..., 0), where there are M zeroes before the 1.
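A quick numerical check of this hint (an editorial addition; M, x1, x2 are arbitrary illustrative values): both vectors are indeed annihilated by the block matrix.

```python
import numpy as np

M, x1, x2 = 4, 1.0, 3.0                 # arbitrary illustrative values
J = np.ones((M, M))
A = np.block([[x1 * J, x2 * J], [x2 * J, x1 * J]])

v1 = np.zeros(2 * M); v1[0], v1[1] = 1, -1       # (1, -1, 0, ..., 0)
v2 = np.zeros(2 * M); v2[M], v2[M + 1] = 1, -1   # M zeros, then (1, -1, ...)

# Every row of A is constant across each half, so adjacent +1/-1
# entries within one half cancel.
assert np.allclose(A @ v1, 0)
assert np.allclose(A @ v2, 0)
print("both vectors lie in the kernel")
```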
 
I think the answer is here:
http://cellular.ci.ulsa.mx/comun/algebra/node65.html
Basically,
if $PX = \lambda X$ and $QY = \mu Y$, then the following holds:

$$(P \otimes Q)(X \otimes Y) = (\lambda\mu)(X \otimes Y).$$

This, I guess, implies that the $(X \otimes Y)$ are eigenvectors of $(P \otimes Q)$ (by definition) and the $(\lambda\mu)$ are its eigenvalues. $M$ should be the only non-zero eigenvalue of the $M \times M$ matrix of all ones (it has rank 1, so 0 has multiplicity $M - 1$, and the trace forces the remaining eigenvalue to be $M$). The eigenvalues of the small matrix consisting of the $x$'s can be found in closed form by solving the associated quadratic.
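The Kronecker identity is easy to demonstrate numerically (an editorial sketch using `np.kron`; the matrices and eigenpairs are arbitrary choices consistent with the thread's setup):

```python
import numpy as np

x1, x2, M = 2.0, 5.0, 3
P = np.array([[x1, x2], [x2, x1]])   # the small 2x2 matrix of the x's
Q = np.ones((M, M))                  # the M x M all-ones matrix

# Eigenpair of P: eigenvalue x1 + x2 with eigenvector (1, 1)
lam, X = x1 + x2, np.array([1.0, 1.0])
# Eigenpair of Q: eigenvalue M with the all-ones eigenvector
mu, Y = float(M), np.ones(M)

# (P kron Q)(X kron Y) should equal (lam * mu)(X kron Y)
lhs = np.kron(P, Q) @ np.kron(X, Y)
rhs = lam * mu * np.kron(X, Y)
assert np.allclose(lhs, rhs)
print("Kronecker eigenvalue identity verified")
```

Since np.kron(P, Q) is exactly the 2M × 2M block matrix from the original post, this reproduces the eigenvalues M(x1 + x2) and, with the other eigenpair of P, M(x1 − x2).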
 
