Eigenvectors and using them in matrix algebra.

Zhiv
Hi.

Matrix A =

|1 1  0|
|0 2  0|
|2 1 -1|

has three eigenvectors [1,1,1]^T, [1,0,1]^T and [0,0,1]^T. Using this knowledge, compute A^11.

Ok, computing A^11 is rather easy with any decent calculator, or even with pen, paper and some time, but how on Earth am I supposed to benefit from those eigenvectors?

Thank you.
 
Look up "diagonalization".

Let P be the 3x3 matrix whose columns are the eigenvectors of A. Then P^-1AP is a diagonal matrix whose diagonal entries are the eigenvalues corresponding to those eigenvectors. If D is that diagonal matrix, we have P^-1AP = D <=> A = PDP^-1, so that A^m = P * D^m * P^-1 for natural m (the last step can be proven by induction). But calculating D^m is easy: just raise each diagonal entry to the power m. Then it's just a matter of working out what P^-1 is, and multiplying the matrices.
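As a concrete sketch of this recipe (my addition, not part of the thread), here is a minimal NumPy check using the eigenvectors from the question as the columns of P, in the order [1,1,1]^T, [1,0,1]^T, [0,0,1]^T:

```python
import numpy as np

# The matrix from the question
A = np.array([[1, 1, 0],
              [0, 2, 0],
              [2, 1, -1]])

# Eigenvectors as columns: [1,1,1]^T, [1,0,1]^T, [0,0,1]^T
P = np.array([[1, 1, 0],
              [1, 0, 0],
              [1, 1, 1]])

# P^-1 A P should come out diagonal, with the eigenvalues on the diagonal
D = np.round(np.linalg.inv(P) @ A @ P).astype(int)

# Powering a diagonal matrix just powers its diagonal entries,
# so A^11 = P D^11 P^-1
A11 = np.round(P @ D**11 @ np.linalg.inv(P)).astype(int)
print(A11)
```

The rounding is only there to clean up floating-point noise from `np.linalg.inv`; the result agrees with exact integer matrix multiplication.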
 


Eigenvectors are a powerful tool in matrix algebra because they allow us to simplify complex matrix operations. In this case, we can use the eigenvectors to easily calculate A^11.

First, we need to find the eigenvalue corresponding to each eigenvector, either by solving the characteristic equation det(A-λI)=0 or simply by computing Av for each given eigenvector v. Here A[1,1,1]^T = [2,2,2]^T, A[1,0,1]^T = [1,0,1]^T and A[0,0,1]^T = [0,0,-1]^T, so the eigenvalues are λ = 2, 1 and -1, respectively.
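These eigenpairs are easy to verify mechanically; a quick check (my addition, not from the original post) that A v = λv holds for each claimed pair:

```python
import numpy as np

A = np.array([[1, 1, 0],
              [0, 2, 0],
              [2, 1, -1]])

# Check A v = lambda * v for each claimed (eigenvector, eigenvalue) pair
pairs = [([1, 1, 1], 2), ([1, 0, 1], 1), ([0, 0, 1], -1)]
for v, lam in pairs:
    v = np.array(v)
    assert np.array_equal(A @ v, lam * v), (v, lam)
print("all eigenpairs check out")
```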

Next, we can use the diagonalization theorem to write A as A=PDP^-1, where P is a matrix with the eigenvectors as its columns and D is a diagonal matrix with the eigenvalues on the diagonal.

So, with the eigenvectors as the columns of P in that order, we have A = PDP^-1 with P = [1,1,0; 1,0,0; 1,1,1], D = [2,0,0; 0,1,0; 0,0,-1] and P^-1 = [0,1,0; 1,-1,0; -1,0,1].

Now, we can easily calculate D^11 by raising each diagonal entry to the 11th power, which gives us [2048,0,0; 0,1,0; 0,0,-1].

Finally, multiplying out P * D^11 * P^-1 transforms the result back to the original basis, giving us A^11 = [1,2047,0; 0,2048,0; 2,2047,-1].
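As a sanity check (my addition, not the original poster's), NumPy's `matrix_power` on the integer matrix uses exact integer arithmetic and can be compared against the diagonalization result:

```python
import numpy as np

A = np.array([[1, 1, 0],
              [0, 2, 0],
              [2, 1, -1]])

# Repeated squaring on the integer matrix is exact, so this is a
# reliable cross-check of the P D^11 P^-1 computation
A11 = np.linalg.matrix_power(A, 11)
print(A11)
```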

As you can see, by using eigenvectors and diagonalization, we were able to simplify the calculation of A^11. This is just one example of how eigenvectors can be used in matrix algebra to make complex operations more manageable. They also have many other applications in fields such as engineering, physics, and computer science. So, while it may seem like a simple calculation in this case, eigenvectors are a valuable tool to have in your toolbox when working with matrices.
 