Aligning a matrix with its eigenvectors, and other questions?

Dr Bwts
Hi,

I have a square symmetric matrix A (ugly, I know)


321.1115, -57.5311, -33.9206
-57.5311, 296.7836, 10.8958
-33.9206, 10.8958, 382.1050

which has the eigenvalues

248.8034
341.6551
409.5415

Am I right in saying that A, when aligned with its eigenvectors, is

248.8034, 0, 0
0, 341.6551, 0
0, 0, 409.5415

?

I would also like to transform the matrix so that,

##A_{11} + A_{22} + A_{33} = 1##

Thanks for any help; I feel like I should know this, but I have been running around in circles for the past 2 hours.
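For anyone who wants to check the numbers, here is a quick numerical sketch (using NumPy; the variable names are just for illustration, not part of the question):

```python
import numpy as np

# the symmetric matrix quoted above
A = np.array([
    [321.1115, -57.5311, -33.9206],
    [-57.5311, 296.7836,  10.8958],
    [-33.9206,  10.8958, 382.1050],
])

# eigh is meant for symmetric matrices: it returns the eigenvalues in
# ascending order and an orthonormal set of eigenvectors as the columns of Q
eigvals, Q = np.linalg.eigh(A)
print(eigvals)            # roughly 248.8034, 341.6551, 409.5415

# expressing A in the eigenvector basis gives the diagonal matrix of eigenvalues
D = Q.T @ A @ Q
print(np.round(D, 4))     # approximately diag(248.8034, 341.6551, 409.5415)
```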
 
"aligned with its eigenvectors" is not standard terminology, but I think what you mean is correct. Given any linear transformation, T, from a vector space of dimension n to itself, we can always represent the transformation as a vector space by using a specific ordered basis for the vector space. The idea is that we apply T to each of the basis vectors in turn, writing the result as a linear combination of the basis vectors. For example, if we have a three dimensional vector space with ordered basis \{v_1, v_2, v_3\} then the vector x_1v_2+ x_2v_2+ x_3v_3 would be represented by the array
\begin{bmatrix}x_1 \\ x_2 \\ x_3\end{bmatrix}
In particular, v_1= 1v_1+ 0v_2+ 0v_3 itself is represented by
\begin{bmatrix}1 \\ 0 \\ 0\end{bmatrix}

So if ##T(v_1) = a_1v_1 + a_2v_2 + a_3v_3##, we can write
$$\begin{bmatrix}a_1 & * & * \\ a_2 & * & * \\ a_3 & * & *\end{bmatrix}\begin{bmatrix}1 \\ 0 \\ 0\end{bmatrix} = \begin{bmatrix}a_1 \\ a_2 \\ a_3\end{bmatrix}$$
where the "*" in the second and third columns can be anything.

That is, the result of ##T## applied to basis vector ##v_i## gives the ##i##th column of the matrix representation. In particular, if the basis vectors are eigenvectors of ##T##, then ##Tv_1 = \lambda_1 v_1 + 0v_2 + 0v_3##, ##Tv_2 = 0v_1 + \lambda_2 v_2 + 0v_3##, and ##Tv_3 = 0v_1 + 0v_2 + \lambda_3 v_3##, so the matrix representation, in that basis, is
$$\begin{bmatrix}\lambda_1 & 0 & 0 \\ 0 & \lambda_2 & 0 \\ 0 & 0 & \lambda_3\end{bmatrix}$$

To reduce those diagonal elements to 1, divide each eigenvector by the corresponding eigenvalue. If ##v_1## is an eigenvector of ##T## with eigenvalue ##\lambda_1##, then
$$T\frac{v_1}{\lambda_1} = \frac{1}{\lambda_1}Tv_1 = \frac{1}{\lambda_1}\lambda_1 v_1 = v_1,$$
so if you use the rescaled vectors ##v_1/\lambda_1##, ##v_2/\lambda_2##, and ##v_3/\lambda_3## as the input basis, while still expressing the images in terms of ##v_1, v_2, v_3##, the resulting matrix is the identity.
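As a concrete illustration of the column-by-column construction described above, here is a small sketch (NumPy again; the particular matrices T and P are made up for the example): the i-th column of the representation is the coordinate vector of ##T(v_i)## with respect to the basis ##v_1, v_2, v_3##.

```python
import numpy as np

# a (made-up) linear map T, given by its matrix in the standard basis
T = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 4.0]])

# an arbitrary basis of R^3, stored as the columns of P
P = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])

# column i of the representation of T in this basis is the coordinate vector
# of T(v_i) with respect to v_1, v_2, v_3, i.e. the solution x of P x = T v_i
M = np.column_stack([np.linalg.solve(P, T @ P[:, i]) for i in range(3)])

# this agrees with the usual change-of-basis formula P^{-1} T P
print(np.allclose(M, np.linalg.inv(P) @ T @ P))   # True
```

If the columns of P are eigenvectors of T, then M comes out diagonal, which is exactly the situation in the question.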
 
Thanks for the reply.

Once the matrix A has been transformed as above, how can I scale it such that

trace(A)=1

?
 
OK, panic over: I just divide the aligned matrix by its trace.
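In NumPy terms (same matrix as in the first post; the names are illustrative), that step is just:

```python
import numpy as np

A = np.array([
    [321.1115, -57.5311, -33.9206],
    [-57.5311, 296.7836,  10.8958],
    [-33.9206,  10.8958, 382.1050],
])

# the trace is basis-independent, so dividing either A or its diagonalized
# form by trace(A) gives a matrix whose diagonal entries sum to 1
A_scaled = A / np.trace(A)
print(np.trace(A_scaled))                  # 1.0 (up to rounding)
print(np.linalg.eigvalsh(A_scaled).sum())  # the eigenvalues now also sum to 1
```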

Thanks for your time.
 