What do you know about eigenvectors and eigenvalues? The basic definition of 'eigenvalue' and 'eigenvector' is that \vec{v} is an eigenvector of the linear transformation A, corresponding to eigenvalue \lambda, if and only if A\vec{v}= \lambda\vec{v}, so that, in its simplest sense, A acts just like multiplication by \lambda when applied to \vec{v}. Of course, a single vector behaving like that wouldn't be very useful, but it is easy to prove that "The set of all eigenvectors corresponding to eigenvalue \lambda (together with the zero vector) forms a subspace": if \vec{u} and \vec{v} are both eigenvectors of A corresponding to the same eigenvalue \lambda, then, for any numbers a and b, a\vec{u}+ b\vec{v} is also an eigenvector, since A(a\vec{u}+ b\vec{v})= aA\vec{u}+ bA\vec{v}= a\lambda\vec{u}+ b\lambda\vec{v}= \lambda(a\vec{u}+ b\vec{v}).
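Here is a small concrete example (the particular matrix is just one I picked for illustration): take A= \begin{bmatrix}2 & 1 \\ 0 & 3\end{bmatrix}. Then A\begin{bmatrix}1 \\ 0\end{bmatrix}= \begin{bmatrix}2 \\ 0\end{bmatrix}= 2\begin{bmatrix}1 \\ 0\end{bmatrix} and A\begin{bmatrix}1 \\ 1\end{bmatrix}= \begin{bmatrix}3 \\ 3\end{bmatrix}= 3\begin{bmatrix}1 \\ 1\end{bmatrix}, so \begin{bmatrix}1 \\ 0\end{bmatrix} is an eigenvector with eigenvalue 2 and \begin{bmatrix}1 \\ 1\end{bmatrix} is an eigenvector with eigenvalue 3.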
An important result of that is: "If we can find a basis for the vector space consisting entirely of eigenvectors of A, then A, written as a matrix using that particular basis, is a diagonal matrix with its eigenvalues on the diagonal". To see that, you need to recognize that applying any matrix, M, to the basis vectors of the vector space gives the columns of M. That is, if Me_i= a_1e_1+ a_2e_2+ \cdots+ a_ne_n, then the ith column of the matrix must be \begin{bmatrix}a_1 \\ a_2 \\ \vdots \\ a_n\end{bmatrix}. To see that, note that e_i is written as a column with all "0"s except for a "1" in the ith place, so that when we multiply each row of M by that column, we pick out only the ith entry of that row; the product is therefore exactly the ith column of M. In particular, if e_i is an eigenvector with eigenvalue \lambda_i, then Me_i= \lambda_ie_i, so the ith column has \lambda_i in the ith place and "0"s everywhere else, which is exactly a diagonal matrix.
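Continuing the example above (still just an illustration), the eigenvectors \begin{bmatrix}1 \\ 0\end{bmatrix} and \begin{bmatrix}1 \\ 1\end{bmatrix} form a basis, and taking P to be the matrix with those eigenvectors as columns gives P^{-1}AP= \begin{bmatrix}1 & -1 \\ 0 & 1\end{bmatrix}\begin{bmatrix}2 & 1 \\ 0 & 3\end{bmatrix}\begin{bmatrix}1 & 1 \\ 0 & 1\end{bmatrix}= \begin{bmatrix}2 & 0 \\ 0 & 3\end{bmatrix}, the diagonal matrix with the eigenvalues 2 and 3 on the diagonal.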
However, not every linear transformation has that property; that is, not every matrix can be "diagonalized". If all n eigenvalues are different, then the corresponding eigenvectors must be independent, so there exists a basis of eigenvectors. Even if there are not n different eigenvalues, there may still be enough independent eigenvectors corresponding to a repeated eigenvalue to form a basis, but not always.
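A standard example of a matrix that cannot be diagonalized (again, my own example for illustration) is \begin{bmatrix}1 & 1 \\ 0 & 1\end{bmatrix}: its only eigenvalue is 1, with multiplicity 2, but solving \left(\begin{bmatrix}1 & 1 \\ 0 & 1\end{bmatrix}- 1I\right)\vec{v}= \vec{0} gives only the multiples of \begin{bmatrix}1 \\ 0\end{bmatrix}, so there is no basis of eigenvectors. On the other hand, the identity matrix \begin{bmatrix}1 & 0 \\ 0 & 1\end{bmatrix} also has the single eigenvalue 1 with multiplicity 2, yet every vector is an eigenvector, so it certainly can be "diagonalized" (it already is diagonal).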