How can I find a basis for the span of some eigenvectors?

ZachKaiser
Hello all. This is my first post here. Hope someone can help. Thank you guys in advance.

Here is the question:

I have an n-by-n matrix A whose eigenvalues are all real and distinct, and the matrix is positive semi-definite. It has linearly independent eigenvectors V_1, ..., V_n. Suppose I already know some of them, say V_1, ..., V_m. How can I get a basis for span{V_(m+1), ..., V_n} without calculating V_(m+1), ..., V_n? (n may be large, so calculating all the eigenvectors is infeasible.)

To better illustrate the question, here is a working example. Let's say

A = [1 1 -1;
     0 2  1;
     0 0  3]

whose eigenvalues and eigenvectors are:
lambda_1 = 1, V_1 = [1 0 0]'
lambda_2 = 2, V_2 = [1 1 0]'
lambda_3 = 3, V_3 = [0 1 1]'
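
For reference, here is a minimal numpy sketch that checks these eigenpairs (illustrative only, not part of the problem itself):

Code:
import numpy as np

# The example matrix (upper triangular, so the eigenvalues 1, 2, 3 sit on the diagonal).
A = np.array([[1.0, 1.0, -1.0],
              [0.0, 2.0,  1.0],
              [0.0, 0.0,  3.0]])

w, V = np.linalg.eig(A)
print(w)   # eigenvalues 1, 2, 3 (order is not guaranteed)
print(V)   # columns are unit-norm multiples of [1 0 0]', [1 1 0]', [0 1 1]'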

If I only know lambda_1 and V_1 now, how can I get a basis for span{V_2, V_3} without calculating V_2 and V_3?

Thanks again and I appreciate your help!


Zach
 
Assuming you know that the matrix has three independent eigenvectors, the other two lie in the space orthogonal to the one you have. Here your V_1 is [1, 0, 0]', so the "orthogonal complement" consists of all [x, y, z]' such that [x, y, z]·[1, 0, 0]' = x = 0, i.e. all vectors of the form [0, y, z]'.
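
A minimal sketch of computing that complement numerically, assuming numpy/scipy is available (variable names are just illustrative):

Code:
import numpy as np
from scipy.linalg import null_space

v1 = np.array([[1.0, 0.0, 0.0]])   # the known eigenvector, written as a 1-by-3 row

# The orthogonal complement of span{v1} is the set of all x with v1 . x = 0,
# i.e. the null space of v1 viewed as a 1-by-3 matrix. The columns returned
# form an orthonormal basis, here of {[0, y, z]'}.
comp = null_space(v1)
print(comp)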
 
Thank you HallsofIvy. Unfortunately, the eigenvectors are not necessarily orthogonal (they are orthogonal only for symmetric matrices), so that idea doesn't seem to work here.

In the example I gave earlier, if you just take a basis for the orthogonal complement of V_1, e.g. [0 1 0]' and [0 0 1]', it is not a basis for span{V_2, V_3}. That is because span{[0 1 0]', [0 0 1]'} is not an invariant subspace of A: the orthogonality to V_1 is not preserved once you multiply by A (see the quick check below).
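
A quick numpy check of this, using the matrix from my first post (sketch only):

Code:
import numpy as np

A = np.array([[1.0, 1.0, -1.0],
              [0.0, 2.0,  1.0],
              [0.0, 0.0,  3.0]])

e2 = np.array([0.0, 1.0, 0.0])
e3 = np.array([0.0, 0.0, 1.0])

# A @ e2 = [1 2 0]' has a nonzero first component, so it is no longer orthogonal
# to V_1 = [1 0 0]': span{e2, e3} is not A-invariant, whereas span{V_2, V_3} is
# (A maps each V_i to lambda_i * V_i).
print(A @ e2)   # [1. 2. 0.]
print(A @ e3)   # [-1. 1. 3.]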

Thanks anyway
 