A matrix is diagonalizable when algebraic and geometric multiplicities are equal

Thread starter: jack_bauer
A matrix is diagonalizable when algebraic and geometric multiplicities are equal.
I know this is true, and my professor proved it, but I did not understand him fully. Can someone please explain?
 
This area is for "Learning Materials", not questions. I am moving this thread to "Linear and Abstract Algebra".
 
A matrix is diagonalizable if and only if there exists a "complete set" of eigenvectors; this is where your "algebraic and geometric multiplicities are equal" comes in. The algebraic multiplicity of an eigenvalue is its multiplicity as a root of the characteristic polynomial; the geometric multiplicity is the number of independent eigenvectors belonging to that eigenvalue, i.e. the dimension of its eigenspace. When the two agree for every eigenvalue (and the characteristic polynomial factors completely into linear factors), the eigenvectors together span the whole space. Specifically, if the matrix represents a linear transformation on a vector space U, then, in order to be "diagonalizable", there must exist a basis for U consisting of eigenvectors of the linear transformation. You construct the matrix representing a linear transformation, in a given basis, by applying the transformation to each basis vector in turn and writing the result as a linear combination of the basis vectors; the coefficients give the corresponding column of the matrix.
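To see the two multiplicities disagree in a concrete case, here is a small sketch using SymPy (assuming it is installed); the matrix ##A## below is an illustrative example, a 2×2 Jordan block:

```python
from sympy import Matrix

# Jordan block: the characteristic polynomial is (x - 2)^2,
# so the eigenvalue 2 has algebraic multiplicity 2, but the
# eigenspace is only 1-dimensional.
A = Matrix([[2, 1],
            [0, 2]])

# eigenvects() returns (eigenvalue, algebraic multiplicity, eigenspace basis)
for eigval, alg_mult, basis in A.eigenvects():
    geo_mult = len(basis)  # dimension of the eigenspace
    print(f"eigenvalue {eigval}: algebraic {alg_mult}, geometric {geo_mult}")

print(A.is_diagonalizable())  # prints False
```

Because the geometric multiplicity (1) is smaller than the algebraic multiplicity (2), there is no basis of eigenvectors, and the matrix is not diagonalizable.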

If all the basis vectors ##\{v_1, v_2, \cdots, v_n\}## are eigenvectors, that is, if ##Lv_1 = \lambda_1 v_1,\ Lv_2 = \lambda_2 v_2,\ \cdots,\ Lv_n = \lambda_n v_n##, then each column consists of the eigenvalue, in the appropriate position, and zeros:
$$\begin{bmatrix}\lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_n\end{bmatrix}$$
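Equivalently, if ##P## is the matrix whose columns are those eigenvectors, then ##P^{-1}AP## is exactly the diagonal matrix above. A minimal sketch with SymPy (assuming it is installed; the matrix ##A## below is just an illustrative example):

```python
from sympy import Matrix

# A diagonalizable example: eigenvalues 5 and 2, each with
# a 1-dimensional eigenspace.
A = Matrix([[4, 1],
            [2, 3]])

# diagonalize() returns P (columns = eigenvectors) and
# D (the diagonal matrix of the corresponding eigenvalues).
P, D = A.diagonalize()

# In the eigenvector basis the transformation is diagonal:
assert P.inv() * A * P == D
print(D)
```

The check ##P^{-1}AP = D## is just the column-by-column construction above: applying ##A## to each eigenvector column of ##P## scales it by its eigenvalue.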
 