MHB Why Are Eigenvalues and Eigenvectors Important in Linear Algebra?

Yankel
Hello all

I have a theoretical question. I know how to find the eigenvalues and eigenvectors of a matrix A. What I am not sure about is what it all means and what we need it for. I did some reading and saw something about stretching vectors: if I am not mistaken, if I have a vector v and I multiply it by the matrix A, it gets stretched by lambda.

I wanted to ask whether this is all it means, or whether it has some other meaning / use in algebra. I am looking for an intuitive understanding of this topic. I also know it is used in statistics, for principal component analysis, but again, I don't understand how stretching vectors is related to that.

I also know that using eigenvalues and eigenvectors you can find a diagonal matrix which is similar to A (am I right?). Why would we need that?

Can you please help me understand what eigenvalues and eigenvectors are?
 
Eigenvalues and eigenvectors are of supreme importance in applied mathematics. Indeed, you can almost say that all of applied mathematics is essentially reducible to this problem (in the more abstract setting, we say it's operator theory on Hilbert Space, in particular spectral analysis).

They are extremely important in quantum mechanics, which was probably the motivating field for thinking about them in the first place. Certainly the extension of eigenvalues and eigenvectors, namely, functional analysis, was motivated by quantum mechanics.

Intuitively, an eigenvector $\mathbf{x}$ of a matrix $A$ is a non-zero vector whose direction is left unchanged when $A$ acts on it, except possibly for a reversal. A matrix $A$ can be thought of as an operator - it does something to a vector. So when we write $A\mathbf{x}=\lambda\mathbf{x}$, where $\mathbf{x}\not=0$, on the LHS, the matrix is acting on $\mathbf{x}$, and on the RHS, we are simply stretching, shrinking, and/or reversing $\mathbf{x}$.
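
To make the stretching picture concrete, here is a small made-up example (not from the original post): for
$$A=\begin{pmatrix}2&1\\1&2\end{pmatrix},\qquad A\begin{pmatrix}1\\1\end{pmatrix}=\begin{pmatrix}3\\3\end{pmatrix}=3\begin{pmatrix}1\\1\end{pmatrix},\qquad A\begin{pmatrix}1\\-1\end{pmatrix}=1\cdot\begin{pmatrix}1\\-1\end{pmatrix},$$
so $(1,1)^T$ is an eigenvector with eigenvalue $3$ (stretched by a factor of $3$) and $(1,-1)^T$ is an eigenvector with eigenvalue $1$ (left unchanged). Most other vectors change direction when $A$ acts on them.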

Here's a terrific, very practical use of eigenvalues and eigenvectors. Suppose we have a system of differential equations
$$\dot{\mathbf{x}}=A\mathbf{x}.$$
The "formal solution" to this DE system is
$$\mathbf{x}=e^{tA}\mathbf{x}_0.$$
But what is this thing $e^{tA}$? What does it mean to exponentiate an operator? Well, think about the series expansion of $e^{x}$:
$$e^x=1+\frac{x}{1!}+\frac{x^2}{2!}+\frac{x^3}{3!}+\dots$$
So it makes sense to define
$$e^{tA}=I+\frac{tA}{1!}+\frac{t^2A^2}{2!}+\frac{t^3A^3}{3!}+\dots$$
But how can we easily compute $A^n$? Well, it turns out, if we can diagonalize $A$ (a process which involves the eigenvalues and eigenvectors of $A$), then we can write $A=PDP^{-1}$, where $P$ is (obviously) invertible, and $D$ is a diagonal matrix. Computing $A^n$ then becomes
$$A^n=(PDP^{-1})^n=\underbrace{(PDP^{-1})(PDP^{-1})\dots(PDP^{-1})}_{n \; \text{times}}=PD^nP^{-1},$$
and it's easy to exponentiate a diagonal matrix: just exponentiate the elements on the diagonal. Plugging $A^n=PD^nP^{-1}$ into the series gives $e^{tA}=Pe^{tD}P^{-1}$, where $e^{tD}$ is the diagonal matrix with entries $e^{t\lambda_i}$. Finding $P$ and $D$ is a matter of finding the eigenvalues and eigenvectors of $A$.

The bottom line: a system of linear ODEs with constant coefficients has essentially been reduced to the problem of finding the eigenvalues and eigenvectors of a matrix, provided the diagonalization is possible (it isn't always).
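
As a rough numerical sketch of that recipe (the matrix $A$, the time $t$, the initial condition, and the choice of NumPy/SciPy are my own, not part of the answer above), one can compare $Pe^{tD}P^{-1}$ against a general-purpose matrix exponential:

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical example: a real 2x2 matrix with distinct eigenvalues (5 and 2),
# so it is diagonalizable.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
t = 0.5
x0 = np.array([1.0, 0.0])  # initial condition for x' = A x

# Eigendecomposition: columns of P are eigenvectors, eigvals holds the lambdas.
eigvals, P = np.linalg.eig(A)

# e^{tA} = P e^{tD} P^{-1}, where e^{tD} is diagonal with entries e^{t*lambda_i}.
etA = P @ np.diag(np.exp(t * eigvals)) @ np.linalg.inv(P)

print(np.allclose(etA, expm(t * A)))  # True: matches SciPy's matrix exponential
print(etA @ x0)                       # the solution x(t) of x' = A x, x(0) = x0
```

In practice one would usually just call `expm` (or solve the ODE directly); the point here is only that diagonalization turns the series for $e^{tA}$ into ordinary scalar exponentials on the diagonal.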
 
Good answer, thank you!
 
Here's one of the things about vector spaces: they have an *algebra* (called, appropriately enough, Linear Algebra), but they also have an *arithmetic*, which is an extension of our normal arithmetic, using matrices and $n$-tuples.

To pass between the two, we need something called a basis. This lets us use *numbers* as matrix entries, or in the $n$-tuples. But vector spaces aren't born with a basis; a basis is something we *impose* upon a vector space (we *choose* it). And there are a LOT of choices.

If we can find a basis of eigenvectors for a given linear transformation, then the matrix of that transformation in that particular basis is diagonal, which makes for "easy arithmetic". And as luck would have it, the eigenvalues of a matrix don't depend on the basis used.
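
To see why the eigenvalues are basis-independent, here is a one-line sketch (using the usual change-of-basis relation, which is implicit rather than stated above): if $B=P^{-1}AP$ and $A\mathbf{x}=\lambda\mathbf{x}$ with $\mathbf{x}\neq 0$, then
$$B(P^{-1}\mathbf{x})=P^{-1}AP\,P^{-1}\mathbf{x}=P^{-1}A\mathbf{x}=\lambda(P^{-1}\mathbf{x}),$$
so $\lambda$ is also an eigenvalue of $B$, now with eigenvector $P^{-1}\mathbf{x}$.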

There is another reason eigenvalues and eigenvectors are important: they reveal a deep connection between vector spaces (more specifically, the linear mappings between them) and polynomials. This is part of the beauty of mathematics: often two seemingly unrelated structures turn out to be long-lost cousins.
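
One standard way to see that connection (not spelled out above, but worth stating): the eigenvalues of $A$ are exactly the roots of its characteristic polynomial
$$p_A(\lambda)=\det(A-\lambda I),$$
which for a $2\times 2$ matrix $A=\begin{pmatrix}a&b\\c&d\end{pmatrix}$ works out to $p_A(\lambda)=\lambda^2-(a+d)\lambda+(ad-bc)$.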
 