# Understanding: Eigenvalues & Eigenvectors/Diagonalizing

1. Jul 23, 2015

### ahmed markhoos

Hello,

I'm having a problem understanding this particular part. I don't know, it seems too dry and beyond my ability to visualize the problems! At the same time, I feel like there are too many gaps in the way the book explains the subject.

I'm using "Mathematical Methods in the Physical Sciences" by Mary Boas.

Are there any useful references or YouTube lectures you can suggest for me?

2. Jul 23, 2015

### blue_leaf77

We can't help unless you dive into the problem and tell us which part of the chapter you can't grasp. As a preliminary, do you know what a matrix is and what operations exist among matrices?

3. Jul 23, 2015

### ahmed markhoos

You know, in the end it's a methods book, not a pure mathematics book. The problem is that I think there are missing details in the section I'm reading, which is on eigenvalues & eigenvectors and diagonalizing matrices.

OK, the same book I mentioned, chapter 3, section 11. And yes, I know what a matrix is and what the operations are.

4. Jul 23, 2015

### Fredrik

Staff Emeritus
5. Jul 23, 2015

### HallsofIvy

Staff Emeritus
What do you know about eigenvectors and eigenvalues? The basic definition of "eigenvalue" and "eigenvector" is that a nonzero vector $\vec{v}$ is an eigenvector of a linear transformation A, corresponding to eigenvalue $\lambda$, if and only if $A\vec{v}= \lambda\vec{v}$, so that, in its simplest sense, A simply acts like multiplication by $\lambda$ when applied to $\vec{v}$. Of course, just one vector acting like that wouldn't be very useful, but it is easy to prove that the set of all eigenvectors corresponding to eigenvalue $\lambda$ (together with the zero vector) forms a subspace: if $\vec{u}$ and $\vec{v}$ are both eigenvectors of A corresponding to the same eigenvalue $\lambda$, then, for any numbers a and b, $a\vec{u}+ b\vec{v}$ is also an eigenvector.
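To make the defining relation $A\vec{v}= \lambda\vec{v}$ and the subspace property concrete, here is a quick check in plain Python (the matrix and vectors are my own illustrative choices, not from the book):

```python
def matvec(A, v):
    """Multiply matrix A (list of rows) by column vector v."""
    return [sum(A[i][j] * v[j] for j in range(len(v))) for i in range(len(A))]

A = [[2, 0, 0],
     [0, 2, 0],
     [0, 0, 5]]

u = [1, 0, 0]   # eigenvector of A with eigenvalue 2
v = [0, 1, 0]   # another eigenvector with the same eigenvalue 2

# any combination a*u + b*v is again an eigenvector for eigenvalue 2
a, b = 3, -4
w = [a * ui + b * vi for ui, vi in zip(u, v)]
assert matvec(A, w) == [2 * wi for wi in w]   # A w = 2 w
```

The assertion passing is exactly the subspace statement: combinations of eigenvectors for the same eigenvalue are still eigenvectors for that eigenvalue.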

An important result of that is: "If we can find a basis for the vector space consisting entirely of eigenvectors of A, then A, written as a matrix in that particular basis, is a diagonal matrix with its eigenvalues on the diagonal". To see that, you need to recognize that applying a matrix M to the basis vectors of the vector space gives the columns of M. That is, if $Me_i= a_1e_1+ a_2e_2+ \cdots+ a_ne_n$, then the ith column of the matrix must be $\begin{bmatrix}a_1 \\ a_2 \\ \vdots \\ a_n\end{bmatrix}$. To see that, recognize that $e_i$ is written as a column with all "0"s except for a "1" in the ith place, so that when we multiply each row of M by that column, only the entry in the ith position of the row survives.
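Here is that diagonalization statement checked numerically with a tiny hand-worked example in Python (the 2×2 matrix and its eigenvectors are my own illustration; changing to the eigenvector basis amounts to computing $P^{-1}AP$, where the columns of P are the eigenvectors):

```python
def matmul(A, B):
    """Multiply matrices A and B (lists of rows)."""
    n, m, p = len(A), len(B), len(B[0])
    return [[sum(A[i][k] * B[k][j] for k in range(m)) for j in range(p)]
            for i in range(n)]

A = [[3, 1],
     [1, 3]]
# eigenvectors (1,1) with eigenvalue 4 and (1,-1) with eigenvalue 2,
# placed as the columns of P
P = [[1,  1],
     [1, -1]]
# inverse of P from the 2x2 cofactor formula (det P = -2)
Pinv = [[0.5,  0.5],
        [0.5, -0.5]]

D = matmul(Pinv, matmul(A, P))
# D is diagonal, with the eigenvalues on the diagonal
assert D == [[4.0, 0.0], [0.0, 2.0]]
```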

However, not every linear transformation has that property. That is, not every matrix can be "diagonalized". If all n eigenvalues are different, then the corresponding eigenvectors must be independent, so there exists a basis of eigenvectors. Even if there are not n different eigenvalues, a repeated eigenvalue may still have enough independent eigenvectors, but not always.
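A standard example of a matrix that cannot be diagonalized is the 2×2 shear matrix below; this quick Python check (my own illustration) shows that its only eigenvalue, 1, has just a one-dimensional eigenspace:

```python
def matvec(A, v):
    """Multiply 2x2 matrix A by column vector v."""
    return [sum(A[i][j] * v[j] for j in range(2)) for i in range(2)]

# A is triangular, so its single (repeated) eigenvalue is 1
A = [[1, 1],
     [0, 1]]

# eigenvectors for eigenvalue 1 solve (A - I)v = 0
AmI = [[0, 1],
       [0, 0]]

assert matvec(AmI, [1, 0]) == [0, 0]   # (1, 0) is an eigenvector
assert matvec(AmI, [0, 1]) != [0, 0]   # (0, 1) is not
# (A - I)v = (v2, 0), so every eigenvector is a multiple of (1, 0):
# one independent eigenvector is not enough for a basis of R^2,
# hence A cannot be diagonalized.
```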

Last edited: Aug 1, 2015
6. Jul 23, 2015

### Mayank Totloor

What is the practical importance of "eigenvalues" and "eigenvectors"? Can somebody please explain clearly what they indicate and how they were invented?

7. Jul 24, 2015

### the_wolfman

Eigenvalues and eigenvectors represent the fundamental modes of a linear system.

It helps to consider some physical systems. For instance, when studying the hydrogen atom in quantum mechanics, there is a linear operator for the energy. The eigenvectors of this operator give you the electron orbitals, and the eigenvalue gives you the energy associated with a particular orbital.

In fluid dynamics you can derive sound waves by studying the properties of a linear operator. The eigenvectors of this operator give you information about how the wave propagates, and the eigenvalues give you the speed of sound.

8. Jul 28, 2015

### HallsofIvy

Staff Emeritus
If a linear transformation from an n-dimensional vector space to itself has a "complete set of eigenvectors", that is, n independent eigenvectors, then, using those eigenvectors as basis vectors, the linear transformation can be written as a diagonal matrix with the eigenvalues on the main diagonal.

A diagonal matrix is particularly easy to work with. In particular, a diagonal matrix is invertible if and only if none of the numbers on its diagonal (its eigenvalues) are 0, and its inverse is then the diagonal matrix whose diagonal entries are the reciprocals of the diagonal entries of the original matrix.
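That inversion rule is easy to check in a couple of lines of Python (the diagonal entries here are my own example values, chosen as powers of two so the floating-point check is exact):

```python
d = [2.0, 4.0, 8.0]               # diagonal entries of D, all nonzero
d_inv = [1.0 / x for x in d]      # diagonal entries of D^{-1}: the reciprocals

# off-diagonal entries of both D and D^{-1} are 0, so checking
# D * D^{-1} = I reduces to the diagonal products being 1
assert all(x * y == 1.0 for x, y in zip(d, d_inv))
```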