Eigenvalues and eigenvectors

In summary, eigenvalues and eigenvectors are extremely important in applied mathematics, quantum mechanics, and functional analysis: they make matrix arithmetic easy (via diagonalization) and reveal a deep connection between vector spaces and polynomials.
  • #1
Yankel
Hello all

I have a theoretical question. I know how to find the eigenvalues and eigenvectors of a matrix A. What I am not sure about is what it all means and what we need it for. I did some reading and saw something about stretching vectors: if I'm not mistaken, when I multiply such a vector v by the matrix A, it just gets stretched by the factor lambda.

I wanted to ask if this is all it means, or whether it has some other meaning or use in algebra. I am looking for an intuitive understanding of this topic. I also know it is used in statistics, for principal component analysis, but again, I don't understand how stretching vectors is related to this.

I also know that using eigenvalues and eigenvectors, you can find a diagonal matrix which is similar to A (am I right?). Why would we need that?

Can you please help me understand what eigenvalues and eigenvectors are?
 
  • #2
Eigenvalues and eigenvectors are of supreme importance in applied mathematics. Indeed, you could almost say that all of applied mathematics essentially reduces to this problem (in the more abstract setting, we call it operator theory on Hilbert spaces, in particular spectral analysis).

They are extremely important in quantum mechanics, which was probably the motivating field for thinking about them in the first place. Certainly functional analysis, the extension of the eigenvalue problem to infinite-dimensional spaces, was motivated by quantum mechanics.

Intuitively, an eigenvector $\mathbf{x}$ of a matrix $A$ is a non-zero vector whose direction is left unchanged when $A$ acts on it, except possibly for a reversal. A matrix $A$ can be thought of as an operator - it does something to a vector. So when we write $A\mathbf{x}=\lambda\mathbf{x}$, where $\mathbf{x}\not=0$, on the LHS, the matrix is acting on $\mathbf{x}$, and on the RHS, we are simply stretching, shrinking, and/or reversing $\mathbf{x}$.
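
To see this numerically, here's a minimal Python/numpy sketch (the matrix and vectors below are just illustrative choices, not anything from this thread):

```python
import numpy as np

# An illustrative matrix with eigenvalues 3 and 1.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

x = np.array([1.0, 1.0])   # an eigenvector of A, eigenvalue 3
print(A @ x)               # [3. 3.] -- same direction, stretched by 3
print(3 * x)               # [3. 3.] -- identical: Ax = lambda x

y = np.array([1.0, 0.0])   # not an eigenvector
print(A @ y)               # [2. 1.] -- A changes this direction
```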

Here's a terrific, very practical use of eigenvalues and eigenvectors. Suppose we have a system of differential equations
$$\dot{\mathbf{x}}=A\mathbf{x}.$$
The "formal solution" to this DE system is
$$\mathbf{x}=e^{tA}\mathbf{x}_0.$$
But what is this thing $e^{tA}$? What does it mean to exponentiate an operator? Well, think about the series expansion of $e^{x}$:
$$e^x=1+\frac{x}{1!}+\frac{x^2}{2!}+\frac{x^3}{3!}+\dots$$
So it makes sense to define
$$e^{tA}=I+\frac{tA}{1!}+\frac{t^2A^2}{2!}+\frac{t^3A^3}{3!}+\dots$$
But how can we easily compute $A^n$? Well, it turns out, if we can diagonalize $A$ (a process which involves the eigenvalues and eigenvectors of $A$), then we can write $A=PDP^{-1}$, where $P$ is (obviously) invertible, and $D$ is a diagonal matrix. Computing $A^n$ then becomes
$$A^n=(PDP^{-1})^n=\underbrace{(PDP^{-1})(PDP^{-1})\dots(PDP^{-1})}_{n \; \text{times}}=PD^nP^{-1},$$
and substituting $A^n=PD^nP^{-1}$ into the series collapses it to $e^{tA}=Pe^{tD}P^{-1}$. It's easy to exponentiate a diagonal matrix: just exponentiate the elements on the diagonal. Finding $P$ and $D$ is a matter of finding the eigenvalues and eigenvectors of $A$.

The bottom line: a system of linear ODEs with constant coefficients has essentially been reduced to the problem of finding the eigenvalues and eigenvectors of a matrix, provided the diagonalization is possible (it isn't, always).
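
If you want to see this recipe in action, here's a small Python/numpy sketch (the matrix $A$, the time $t$, and the initial condition are made-up examples; scipy's expm is used only as an independent check):

```python
import numpy as np
from scipy.linalg import expm

# Illustrative system xdot = A x; this A has eigenvalues -1 and -2.
A = np.array([[ 0.0,  1.0],
              [-2.0, -3.0]])
x0 = np.array([1.0, 0.0])
t = 0.5

# Diagonalize: columns of P are eigenvectors, eigvals fill D.
eigvals, P = np.linalg.eig(A)

# exp(tA) = P exp(tD) P^{-1}; exp of a diagonal matrix is just
# exp applied entrywise to the diagonal.
exp_tA = P @ np.diag(np.exp(t * eigvals)) @ np.linalg.inv(P)
print(exp_tA @ x0)        # x(t) = exp(tA) x0

print(expm(t * A) @ x0)   # cross-check: general matrix exponential
```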
 
  • #3
Good answer, thank you!
 
  • #4
Here's one of the things about vector spaces: they have an *algebra* (called, appropriately enough, Linear Algebra), but they also have an *arithmetic*, which is an extension of our normal arithmetic, using matrices and $n$-tuples.

To pass between the two, we need something called a basis. This lets us use *numbers* as matrix entries, or in the $n$-tuples. But vector spaces aren't born with a basis; bases are something we *impose* upon a vector space (we *choose* them). And there are a LOT of choices.

If we can find a basis of eigenvectors for a given linear transformation, the matrix of that transformation in that particular basis is diagonal, which makes for "easy arithmetic". And as luck would have it, the eigenvalues of a matrix don't depend on the basis used.
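
Here's a quick numerical illustration of that basis-independence (both matrices below are arbitrary choices, with S playing the role of a change of basis):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])   # eigenvalues 2 and 5
S = np.array([[1.0, 1.0],
              [0.0, 1.0]])   # any invertible matrix will do

# The same linear transformation, expressed in the new basis.
B = np.linalg.inv(S) @ A @ S

print(np.sort(np.linalg.eigvals(A)))   # [2. 5.]
print(np.sort(np.linalg.eigvals(B)))   # [2. 5.] -- same spectrum
```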

There is another reason eigenvalues and eigenvectors are important: they reveal a deep connection between vector spaces (more specifically, the linear mappings between them) and polynomials, through the characteristic and minimal polynomials. This is part of the beauty of mathematics: often two seemingly unrelated structures turn out to be long-lost cousins.
 

1. What are eigenvalues and eigenvectors?

Eigenvalues and eigenvectors describe the behavior of a linear transformation. An eigenvector is a non-zero vector whose direction the transformation leaves unchanged (apart from a possible reversal), and the corresponding eigenvalue is the scalar factor by which it is stretched: Ax = λx.

2. What is the significance of eigenvalues and eigenvectors in linear algebra?

Eigenvalues and eigenvectors are important because they reveal key properties of a linear transformation: the directions it leaves invariant and how strongly it stretches along them. They are also used to solve systems of linear differential equations and to diagonalize matrices.

3. How do you calculate eigenvalues and eigenvectors?

Eigenvalues can be calculated by solving the characteristic equation det(A-λI)=0, where A is the matrix and λ is the eigenvalue. Each eigenvector x is then a non-zero solution of the system of equations (A-λI)x=0.
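
A worked instance of this recipe, using sympy for the symbolic steps (the 2×2 matrix is a made-up example):

```python
import sympy as sp

lam = sp.symbols('lambda')
A = sp.Matrix([[2, 1],
               [1, 2]])

# Characteristic equation: det(A - lambda*I) = 0
char_poly = (A - lam * sp.eye(2)).det()
print(sp.factor(char_poly))       # factors as (lambda - 1)*(lambda - 3)
print(sp.solve(char_poly, lam))   # [1, 3]

# Eigenvectors: non-zero solutions of (A - lambda*I) x = 0
for val, mult, vecs in A.eigenvects():
    print(val, [list(v) for v in vecs])   # 1: [-1, 1] and 3: [1, 1]
```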

4. What is the physical interpretation of eigenvectors?

The physical interpretation of eigenvectors varies with the application, but generally they give the directions a transformation preserves, while the eigenvalues give the scale factors along those directions. In physics, they can represent the principal axes of a rigid body or the modes of vibration of a system.
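
For the principal-axes interpretation, here is a tiny numpy sketch with a symmetric matrix (the numbers are an arbitrary illustration, standing in for an inertia tensor or a covariance matrix):

```python
import numpy as np

M = np.array([[3.0, 1.0],
              [1.0, 3.0]])     # symmetric, e.g. an inertia tensor

vals, vecs = np.linalg.eigh(M)  # eigh is specialized for symmetric M
print(vals)                     # [2. 4.] -- principal moments
print(vecs)                     # columns: orthonormal principal axes
print(vecs.T @ M @ vecs)        # diagonal: M in its own axes
```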

5. Can a matrix have complex eigenvalues and eigenvectors?

Yes, a matrix can have complex eigenvalues and eigenvectors. This occurs when the matrix has complex entries, and also for real matrices that rotate vectors rather than merely stretching them; in the real case, complex eigenvalues always come in conjugate pairs. Since a rotation leaves no real direction unchanged, its eigenvectors are necessarily complex.
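
A standard concrete example is a real rotation matrix, sketched here with numpy (90-degree rotation in the plane):

```python
import numpy as np

# Rotation by 90 degrees: no real direction is left unchanged,
# so the eigenvalues and eigenvectors must be complex.
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])

vals, vecs = np.linalg.eig(R)
print(vals)   # [0.+1.j 0.-1.j] -- conjugate pair on the unit circle
```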
