MHB Why Are Eigenvalues and Eigenvectors Important in Linear Algebra?

Yankel
Hello all

I have a theoretical question. I know how to find the eigenvalues and eigenvectors of a matrix A. What I am not sure about is what it all means and what we need it for. I did some reading and saw something about stretching a vector; if I am not mistaken, if I have a vector v and I multiply it by the matrix A, then it gets stretched by lambda.

I wanted to ask if this is all it means, or whether it has some other meaning or use in algebra. I am looking for an intuitive understanding of this topic. I also know it is used in statistics, for principal component analysis, but again, I don't understand how stretching vectors is related to this.

I also know that using eigenvalues and eigenvectors, you can find a diagonal matrix which is similar to A (am I right?). What would we need that for?

Can you please help me understand what eigenvalues and eigenvectors are?
 
Eigenvalues and eigenvectors are of supreme importance in applied mathematics. Indeed, you can almost say that all of applied mathematics is essentially reducible to this problem (in the more abstract setting, it becomes operator theory on Hilbert space, in particular spectral analysis).

They are extremely important in quantum mechanics, which was probably the motivating field for thinking about them in the first place. Certainly the extension of eigenvalues and eigenvectors, namely, functional analysis, was motivated by quantum mechanics.

Intuitively, an eigenvector $\mathbf{x}$ of a matrix $A$ is a non-zero vector whose direction is left unchanged when $A$ acts on it, except possibly for a reversal. A matrix $A$ can be thought of as an operator - it does something to a vector. So when we write $A\mathbf{x}=\lambda\mathbf{x}$, where $\mathbf{x}\not=0$, on the LHS, the matrix is acting on $\mathbf{x}$, and on the RHS, we are simply stretching, shrinking, and/or reversing $\mathbf{x}$.
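
To make that picture concrete, here is a minimal numerical sketch (Python with NumPy; the matrix is just an arbitrary example): applying $A$ to one of its eigenvectors only rescales it by the corresponding eigenvalue.

```python
import numpy as np

# An arbitrary example matrix (nothing special about these entries).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose columns are eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)

# Take the first eigenpair and check A x = lambda x:
# the matrix only stretches x, it does not change its direction.
lam = eigenvalues[0]
x = eigenvectors[:, 0]
print(A @ x)      # the same vector as ...
print(lam * x)    # ... lambda times x
```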

Here's a terrific, very practical use of eigenvalues and eigenvectors. Suppose we have a system of differential equations
$$\dot{\mathbf{x}}=A\mathbf{x}.$$
The "formal solution" to this DE system is
$$\mathbf{x}=e^{tA}\mathbf{x}_0.$$
But what is this thing $e^{tA}$? What does it mean to exponentiate an operator? Well, think about the series expansion of $e^{x}$:
$$e^x=1+\frac{x}{1!}+\frac{x^2}{2!}+\frac{x^3}{3!}+\dots$$
So it makes sense to define
$$e^{tA}=I+\frac{tA}{1!}+\frac{t^2A^2}{2!}+\frac{t^3A^3}{3!}+\dots$$
But how can we easily compute $A^n$? Well, it turns out, if we can diagonalize $A$ (a process which involves the eigenvalues and eigenvectors of $A$), then we can write $A=PDP^{-1}$, where $P$ is (obviously) invertible, and $D$ is a diagonal matrix. Computing $A^n$ then becomes
$$A^n=(PDP^{-1})^n=\underbrace{(PDP^{-1})(PDP^{-1})\dots(PDP^{-1})}_{n \; \text{times}}=PD^nP^{-1},$$
and it's easy to exponentiate a diagonal matrix: just exponentiate the elements on the diagonal. Finding $P$ and $D$ is a matter of finding the eigenvalues and eigenvectors of $A$.

The bottom line: a system of linear ODEs with constant coefficients has essentially been reduced to the problem of finding the eigenvalues and eigenvectors of a matrix, provided the diagonalization is possible (it isn't always).
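
To see the whole recipe in action, here is a short numerical sketch (Python with NumPy/SciPy; the matrix $A$, the initial condition, and the time $t$ are made-up examples). It diagonalizes $A$, builds $e^{tA}=Pe^{tD}P^{-1}$ by exponentiating the diagonal, and cross-checks the result against SciPy's general-purpose matrix exponential.

```python
import numpy as np
from scipy.linalg import expm

# Made-up example system x' = A x with initial condition x0, evaluated at time t.
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])
x0 = np.array([1.0, 0.0])
t = 1.5

# Diagonalize: the columns of P are eigenvectors, the eigenvalues fill the diagonal of D.
eigenvalues, P = np.linalg.eig(A)

# e^{tA} = P e^{tD} P^{-1}; exponentiating the diagonal matrix tD
# just means exponentiating each diagonal entry.
etA = P @ np.diag(np.exp(t * eigenvalues)) @ np.linalg.inv(P)

# Solution of the ODE system at time t.
print(etA @ x0)

# Sanity check against SciPy's matrix exponential, which also works
# when A is not diagonalizable (unlike the eigenvector recipe above).
print(np.allclose(etA, expm(t * A)))   # True
```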
 
Good answer, thank you!
 
Here's one of the things about vector spaces: they have an *algebra* (called, appropriately enough, Linear Algebra), but they also have an *arithmetic*, which is an extension of our normal arithmetic, using matrices and $n$-tuples.

To pass between the two, we need something called a basis. This lets us use *numbers* as matrix entries, or in the $n$-tuples. But a vector space isn't born with a basis; a basis is something we *impose* upon the space (we *choose* it). And there are a LOT of choices.

If we can find a basis of eigenvectors for a given linear transformation, then the matrix of that transformation in that particular basis is diagonal, which makes for "easy arithmetic". And as luck would have it, the eigenvalues of a matrix don't depend on the basis used.
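
Here is a small numerical sketch of that idea (Python with NumPy; the matrices are arbitrary examples): in the basis of eigenvectors the transformation is represented by a diagonal matrix, and any other change of basis leaves the eigenvalues untouched.

```python
import numpy as np

# Arbitrary example: a linear transformation written in the standard basis.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# The columns of P form a basis of eigenvectors.
eigenvalues, P = np.linalg.eig(A)

# The matrix of the same transformation in the eigenbasis is P^{-1} A P:
# diagonal, with the eigenvalues on the diagonal ("easy arithmetic").
print(np.round(np.linalg.inv(P) @ A @ P, 10))

# Any other invertible change of basis Q gives a similar matrix Q^{-1} A Q
# with exactly the same eigenvalues; they don't depend on the basis.
Q = np.array([[1.0, 2.0],
              [0.0, 1.0]])
print(np.sort(np.linalg.eigvals(np.linalg.inv(Q) @ A @ Q)))  # [2. 5.]
print(np.sort(eigenvalues))                                   # [2. 5.]
```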

There is another reason eigenvalues and eigenvectors are important: they reveal a deep connection between vector spaces (more specifically, the linear mappings between them) and polynomials. This is part of the beauty of mathematics: often two seemingly unrelated structures turn out to be long-lost cousins.
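
One concrete face of that connection is the characteristic polynomial $\det(\lambda I - A)$: its roots are precisely the eigenvalues of $A$. A minimal sketch (Python with NumPy; the matrix is an arbitrary example):

```python
import numpy as np

# Arbitrary example matrix.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.poly(A) gives the coefficients of the characteristic polynomial
# det(lambda*I - A); for this A it is lambda^2 - 4*lambda + 3.
coeffs = np.poly(A)
print(coeffs)                     # [ 1. -4.  3.]

# The roots of that polynomial are exactly the eigenvalues of A.
print(np.roots(coeffs))           # [3. 1.]
print(np.linalg.eigvals(A))       # [3. 1.]
```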
 