I'm going through the book: Principles of Quantum Mechanics 2nd edition by R. Shankar
1.
In the mathematical introduction to projection operators (page 22), it writes:
"Consider the expansion of an arbitrary ket |V\rangle in a basis:
|V\rangle=\sum_{i} |i\rangle\langle i|V\rangle"...
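Not from the book, but here's a quick numerical sanity check of that expansion (a sketch assuming NumPy; the orthonormal basis is a random one built with a QR decomposition):

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 3

# Columns of Q form a random orthonormal basis {|i>} of C^3
Q, _ = np.linalg.qr(rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim)))

# Completeness: sum_i |i><i| should be the identity
P_sum = sum(np.outer(Q[:, i], Q[:, i].conj()) for i in range(dim))
assert np.allclose(P_sum, np.eye(dim))

# Expanding an arbitrary ket |V> as sum_i |i><i|V> recovers |V>
V = rng.normal(size=dim) + 1j * rng.normal(size=dim)
V_expanded = sum(Q[:, i] * (Q[:, i].conj() @ V) for i in range(dim))
assert np.allclose(V_expanded, V)
```

The second check is exactly the quoted identity: the coefficients \langle i|V\rangle are the inner products Q[:, i].conj() @ V.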
It would come from the fact that if the matrix is singular, then one column is a multiple of the other (or one row is a multiple of the other). Maybe this isn't how it's supposed to be done, though, so you might be better off waiting for somebody else's ideas than going off mine :/
I'm no linear algebra expert, but I would start by writing the matrix as:
\begin{pmatrix} a & b \\ c & d \end{pmatrix}
Then it's singular if a = xb and c = xd (one column is a multiple of the other), or if a = xc and b = xd (one row is a multiple of the other), and work my way from there to the formula for the determinant.
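Working that last step out explicitly (my algebra, not part of the thread), the column case gives:

```latex
\text{If } a = xb \text{ and } c = xd, \text{ then }
\det\begin{pmatrix} a & b \\ c & d \end{pmatrix}
= ad - bc = (xb)d - b(xd) = 0,
```

and the row case (a = xc, b = xd) gives ad - bc = (xc)d - (xd)c = 0 the same way.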
Thanks, that helps a lot.
Does this make sense as an answer?
If X is an eigenvector of A with eigenvalue \lambda, then:
(\lambda I -A)X=0
Multiply on the left by P:
P(\lambda I -A)X=0
Distribute the P:
(\lambda P -PA)X=0
Insert I = P^{-1}P between the matrix and X:
(\lambda P -PA)P^{-1}PX=0
Move the P^{-1} inside:
(\lambda PP^{-1} -PAP^{-1})PX = (\lambda I - PAP^{-1})(PX) = 0
so PX is an eigenvector of PAP^{-1} with the same eigenvalue \lambda.
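You can also sanity-check the claim numerically (a sketch, not part of the thread; assumes NumPy, with A and P just random matrices):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

A = rng.normal(size=(n, n))
P = rng.normal(size=(n, n))        # invertible with probability 1
Pinv = np.linalg.inv(P)            # raises LinAlgError if P were singular

lams, X = np.linalg.eig(A)         # columns of X are eigenvectors of A
B = P @ A @ Pinv                   # the similar matrix PAP^{-1}

for k in range(n):
    lam, x = lams[k], X[:, k]
    # If A x = lam x, then B (P x) = lam (P x):
    assert np.allclose(B @ (P @ x), lam * (P @ x))
```

So every PX with X in E_{\lambda}(A) does land in E_{\lambda}(PAP^{-1}), which is the forward inclusion the proof above establishes.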
The question is at the end of a chapter on spanning vector spaces.
Homework Statement
Let P denote an invertible n x n matrix.
If \lambda is a number, show that
E_{\lambda}(PAP^{-1}) = \left\{PX \mid X \in E_{\lambda}(A)\right\}
for each n x n matrix A. [Here E_{\lambda}(A) is...