Actually, in order for a matrix to be diagonalizable, it is NOT necessary that all the eigenvalues be distinct. It IS necessary that there be enough linearly independent eigenvectors, that is, that there exist a basis for the vector space consisting of eigenvectors.
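To illustrate the point numerically, here is a quick sketch in Python with numpy (the two matrices are just illustrative choices): both have the repeated eigenvalue 2, but only the first has an eigenvector basis.

```python
import numpy as np

# A has the repeated eigenvalue 2 but IS diagonalizable:
# it is already diagonal, and its eigenvectors span R^2.
A = np.diag([2.0, 2.0])
vals_A, vecs_A = np.linalg.eig(A)
rank_A = np.linalg.matrix_rank(vecs_A)  # 2: eigenvectors form a basis

# B also has the repeated eigenvalue 2 but is NOT diagonalizable:
# its eigenvectors span only a 1-dimensional subspace.
B = np.array([[2.0, 1.0],
              [0.0, 2.0]])
vals_B, vecs_B = np.linalg.eig(B)
rank_B = np.linalg.matrix_rank(vecs_B)  # 1: no eigenvector basis

print(rank_A, rank_B)
```

So repeated eigenvalues by themselves don't decide anything; what matters is whether the eigenvectors span the whole space.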
Essentially, the eigenvectors are those vectors on which the linear transformation acts like simple scalar multiplication.
If A is an n by n matrix, then a non-zero vector x in R^n is an eigenvector of A if Ax is a scalar multiple of x. This scalar is called an eigenvalue of A.
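A small numeric check of that definition (the matrix A and vector x below are just illustrative choices):

```python
import numpy as np

# x is an eigenvector of A exactly when A @ x is a scalar multiple of x.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
x = np.array([1.0, 1.0])   # claimed eigenvector

Ax = A @ x
print(Ax)                  # equals 3 * x, so the eigenvalue is 3
```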
Q) So if you have an eigenvector, the transformation only 'stretches' or 'compresses' it by a factor of the eigenvalue? Explain. And how can we use determinants to find the eigenvalues of a given matrix?
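On the determinant question: a scalar c is an eigenvalue of A exactly when A - cI is singular, i.e. when det(A - cI) = 0. Expanding that determinant in the variable c gives the characteristic polynomial, whose roots are the eigenvalues. A small sketch (the matrix is just an illustrative choice):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# For this 2x2 matrix, det(A - t*I) = (2-t)^2 - 1 = t^2 - 4t + 3.
coeffs = np.poly(A)        # characteristic polynomial coefficients [1, -4, 3]
roots = np.sort(np.roots(coeffs))  # its roots, [1, 3], are the eigenvalues

print(roots)
print(np.sort(np.linalg.eigvals(A)))  # same values, computed directly
```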
(off-topic) Has this got anything to do with the Kronecker Delta in Tensor Calculus?
An n by n matrix M is diagonalizable if and only if R^n has a basis of eigenvectors of M; equivalently, if and only if the minimal polynomial P of M is a product of distinct linear factors; equivalently, if and only if the characteristic polynomial Q splits into a product of linear factors and, for each root c of Q, the kernel of M - cId has dimension equal to the multiplicity of the factor (X - c) in Q.
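The last criterion can be tested numerically: for each eigenvalue c, compare the geometric multiplicity dim ker(M - cId) with the algebraic multiplicity of (X - c) in the characteristic polynomial. A floating-point sketch (the helper `is_diagonalizable` and its tolerance are my own illustrative choices, not a library routine):

```python
import numpy as np

def is_diagonalizable(M, tol=1e-9):
    """Check geometric multiplicity == algebraic multiplicity for each eigenvalue."""
    n = M.shape[0]
    remaining = list(np.linalg.eigvals(M))
    while remaining:
        c = remaining[0]
        # group (nearly) equal eigenvalues: algebraic multiplicity of c
        group = [v for v in remaining if abs(v - c) < tol]
        remaining = [v for v in remaining if abs(v - c) >= tol]
        alg_mult = len(group)
        # geometric multiplicity: dimension of ker(M - c*I)
        geo_mult = n - np.linalg.matrix_rank(M - c * np.eye(n), tol=tol)
        if geo_mult != alg_mult:
            return False
    return True

print(is_diagonalizable(np.array([[2.0, 0.0], [0.0, 2.0]])))  # diagonal: True
print(is_diagonalizable(np.array([[2.0, 1.0], [0.0, 2.0]])))  # Jordan block: False
```

The Jordan block fails because the factor (X - 2) has multiplicity 2 in Q, while ker(M - 2Id) is only 1-dimensional.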