What Are Eigenvectors and Eigenvalues? - Comments

Mentor
"when a matrix A multiplies an eigenvector, the result is a vector in the same (or possibly opposite) direction"
This is from the basic definition of an eigenvector.

An eigenvector ##\vec{x}## for a matrix ##A## is a nonzero vector for which ##A\vec{x} = \lambda\vec{x}## for some scalar ##\lambda##. From this definition, it can be seen that ##A\vec{x}## results in a vector that is a multiple of ##\vec{x}##, hence it has the same direction or the opposite direction. Any time you multiply a vector by a scalar, the new vector is in the same direction or the opposite direction.
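A quick sanity check of that definition, using a made-up 2×2 matrix whose eigenvectors are easy to verify by hand (the matrix and vectors below are purely illustrative):

```python
# Illustrative matrix: A = [[2, 1], [1, 2]] has eigenvector (1, 1)
# with eigenvalue 3 and eigenvector (1, -1) with eigenvalue 1.

def mat_vec(A, x):
    """Multiply a 2x2 matrix A by a 2-vector x."""
    return [A[0][0]*x[0] + A[0][1]*x[1],
            A[1][0]*x[0] + A[1][1]*x[1]]

A = [[2, 1], [1, 2]]

x = [1, 1]            # eigenvector with eigenvalue 3
print(mat_vec(A, x))  # [3, 3] == 3 * x: same direction, just scaled

y = [1, -1]           # eigenvector with eigenvalue 1
print(mat_vec(A, y))  # [1, -1] == 1 * y: unchanged

z = [1, 0]            # not an eigenvector of A
print(mat_vec(A, z))  # [2, 1]: not a scalar multiple of z
```

Multiplying an eigenvector only stretches it along its own line, while a non-eigenvector like ##(1, 0)## gets knocked into a different direction.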

I recently stumbled across a great "intuition oriented" supplement to Mark44's Insight article. It has some nice animations that help visualize it from a geometric perspective.

Stephen Tashi
but the engineer in me cries out... "But what is it for?" :)
An unsophisticated indication:

Suppose a problem involves a given matrix ##M## and many different column vectors ##v_1,v_2,v_3,...## and that you must compute the products ##Mv_1, Mv_2,Mv_3,...##.

Further suppose you have your choice about what basis to use in representing the vectors and that ##M## has two eigenvectors ##e_1, e_2## with respective eigenvalues ##\lambda_1, \lambda_2##.

In the happy event that each vector ##v_i## can be represented as a linear combination of ##e_1, e_2##, you could do all the multiplications without actually multiplying a matrix times a vector in detail. For example, if ##v_1 = 3e_1 + 4e_2## then
##Mv_1 = 3Me_1 + 4Me_2 = 3\lambda_1 e_1 + 4 \lambda_2 e_2##.

The coordinate representation of that would be ##M \begin{pmatrix} 3 \\ 4 \end{pmatrix} = \begin{pmatrix} 3 \lambda_1 \\ 4\lambda_2 \end{pmatrix}## provided the coordinates represent the vectors ##v_i## in the basis ##\{e_1, e_2\}##.
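A small numerical check of that shortcut. The matrix here is made up for illustration (it is not from the post), chosen so its eigenvectors are easy to write down:

```python
import numpy as np

# Illustrative matrix: M = [[2, 1], [1, 2]] has eigenvector
# e1 = (1, 1) with lambda1 = 3 and e2 = (1, -1) with lambda2 = 1.
M = np.array([[2.0, 1.0], [1.0, 2.0]])
e1, lam1 = np.array([1.0, 1.0]), 3.0
e2, lam2 = np.array([1.0, -1.0]), 1.0

# A vector expressed in the eigenbasis: v = 3*e1 + 4*e2
v = 3*e1 + 4*e2

# Direct matrix-vector product...
direct = M @ v

# ...versus the eigenvalue shortcut: M v = 3*lambda1*e1 + 4*lambda2*e2,
# which needs no matrix multiplication at all.
shortcut = 3*lam1*e1 + 4*lam2*e2

assert np.allclose(direct, shortcut)
print(direct, shortcut)  # both give [13. 5.]
```

Once the vectors are stored in eigenbasis coordinates, applying ##M## reduces to multiplying each coordinate by the corresponding eigenvalue.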

Of course, you might say "But I'd have to change all the ##v_i## to be in the ##\{e_1, e_2\}## basis". However, in practical data collection, raw data is reduced to some final form, so at least you know that the "eigenbasis" would be a good format for the reduced data. Also, in theoretical reasoning, it is often simpler to imagine that a set of vectors is represented in the eigenbasis of a particular matrix.

A randomly chosen set of vectors can't necessarily be represented using the eigenvectors of a given matrix as a basis. However, there are many situations where the vectors involved in a physical situation can be represented using the eigenvectors of a matrix involved in that situation. Why Nature causes this to happen varies from case to case. When it does happen in Nature, we don't want to overlook it because it offers a great simplification.

ibkev
This is not *at all* a rigorous, well-thought-out definition, but...

I personally like to think of it this way: whenever a linear operator acts on some vector space, it transforms the vector subspaces inside, right? There might be some subspaces that aren't rotated or otherwise changed in direction, only scaled by some factor. Those invariant subspaces contain the eigenvectors, and the scaling factor is the corresponding eigenvalue.

It's not hard to extend this general idea of "invariance" to, say, finding the allowed states of a system in quantum mechanics, especially when you remember that the Hamiltonian is a linear operator. Linear algebra in general is the art of taking algebraic, abstract concepts and putting them into concrete matrices and systems of equations.
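The invariance picture above can be checked numerically with NumPy's `np.linalg.eig`, which returns the eigenvalues and (unit) eigenvectors of a matrix. The operator below is a made-up example with real eigenvalues:

```python
import numpy as np

# A generic made-up linear operator on R^2 (eigenvalues 5 and 2).
A = np.array([[4.0, 1.0], [2.0, 3.0]])

# np.linalg.eig returns eigenvalues and unit eigenvectors (as columns).
eigvals, eigvecs = np.linalg.eig(A)

for lam, x in zip(eigvals, eigvecs.T):
    # Invariance of the 1-D subspace spanned by x: A x is parallel to x
    # (A x == lam * x), so the line through x is only scaled, never rotated.
    assert np.allclose(A @ x, lam * x)
    print(f"eigenvalue {lam:.1f}: the line spanned by {x} is invariant")
```

Each eigenvector spans a one-dimensional invariant subspace: the operator maps every vector on that line back onto the same line, stretched by the eigenvalue.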
