Why Are Eigenvalues and Eigenvectors Important in Linear Algebra?


Discussion Overview

The discussion revolves around the significance of eigenvalues and eigenvectors in linear algebra, exploring their theoretical implications, applications in various fields, and intuitive understanding. Participants express curiosity about the meaning and utility of these concepts, particularly in relation to matrix operations and their role in solving differential equations.

Discussion Character

  • Exploratory
  • Technical explanation
  • Conceptual clarification
  • Debate/contested

Main Points Raised

  • One participant seeks an intuitive understanding of eigenvalues and eigenvectors, questioning their meaning beyond the concept of stretching vectors.
  • Another participant emphasizes the foundational role of eigenvalues and eigenvectors in applied mathematics, particularly in operator theory and spectral analysis.
  • It is noted that eigenvalues and eigenvectors are crucial in quantum mechanics, and that this field motivated the development of their extension, functional analysis.
  • A technical explanation is provided regarding how eigenvectors maintain their direction under matrix transformations, with the relationship expressed as \( A\mathbf{x}=\lambda\mathbf{x} \).
  • A practical application is discussed involving the solution of systems of differential equations, highlighting how diagonalization simplifies the computation of matrix exponentiation.
  • One participant mentions the importance of having a basis of eigenvectors to facilitate easier arithmetic in linear transformations.
  • Another point raised is the connection between vector spaces and polynomials revealed by eigenvalues and eigenvectors, suggesting a deeper mathematical relationship.

Areas of Agreement / Disagreement

Participants express varying degrees of understanding and curiosity about the topic, with some agreeing on the importance of eigenvalues and eigenvectors while others seek clarification on their implications and applications. The discussion remains unresolved regarding the intuitive understanding of these concepts.

Contextual Notes

Participants mention various applications and theoretical aspects of eigenvalues and eigenvectors, but there are unresolved questions about their broader implications and the conditions under which diagonalization is possible.

Yankel
Hello all

I have a theoretical question. I know how to find the eigenvalues and eigenvectors of a matrix $A$. What I am not sure about is what it all means and why we need it. I did some reading and saw something about stretching vectors: if I'm not mistaken, if I have a vector $\mathbf{v}$ and I multiply it by the matrix $A$, then it gets stretched by a factor $\lambda$.

I wanted to ask whether this is all it means, or whether it has some other meaning or use in algebra. I am looking for an intuitive understanding of this topic. I also know it is used in statistics, for principal component analysis, but again, I don't understand how stretching vectors is related to this.

I also know that using eigenvalues and eigenvectors, you can find a diagonal matrix which is similar to $A$ (am I right?). Why would we need that?

Can you please help me understand what eigenvalues and eigenvectors are?
 
Eigenvalues and eigenvectors are of supreme importance in applied mathematics. Indeed, you can almost say that all of applied mathematics is essentially reducible to this problem (in the more abstract setting, we say it's operator theory on Hilbert Space, in particular spectral analysis).

They are extremely important in quantum mechanics, which was probably the motivating field for thinking about them in the first place. Certainly the extension of eigenvalues and eigenvectors, namely, functional analysis, was motivated by quantum mechanics.

Intuitively, an eigenvector $\mathbf{x}$ of a matrix $A$ is a non-zero vector whose direction is left unchanged when $A$ acts on it, except possibly for a reversal. A matrix $A$ can be thought of as an operator - it does something to a vector. So when we write $A\mathbf{x}=\lambda\mathbf{x}$, where $\mathbf{x}\not=0$, on the LHS, the matrix is acting on $\mathbf{x}$, and on the RHS, we are simply stretching, shrinking, and/or reversing $\mathbf{x}$.
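
To make this concrete, here is a minimal numerical sketch (using NumPy, with an arbitrarily chosen symmetric matrix) verifying that $A$ acting on each of its eigenvectors does nothing but scale it by the corresponding eigenvalue:

```python
import numpy as np

# Arbitrary example matrix (any diagonalizable matrix would do).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose
# columns are the corresponding eigenvectors.
eigvals, eigvecs = np.linalg.eig(A)

for i, lam in enumerate(eigvals):
    x = eigvecs[:, i]
    # A acting on x is just x stretched by lambda: A x = lambda x.
    print(np.allclose(A @ x, lam * x))  # True for every eigenpair
```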

Here's a terrific, very practical use of eigenvalues and eigenvectors. Suppose we have a system of differential equations
$$\dot{\mathbf{x}}=A\mathbf{x}.$$
The "formal solution" to this DE system is
$$\mathbf{x}=e^{tA}\mathbf{x}_0.$$
But what is this thing $e^{tA}$? What does it mean to exponentiate an operator? Well, think about the series expansion of $e^{x}$:
$$e^x=1+\frac{x}{1!}+\frac{x^2}{2!}+\frac{x^3}{3!}+\dots$$
So it makes sense to define
$$e^{tA}=I+\frac{tA}{1!}+\frac{t^2A^2}{2!}+\frac{t^3A^3}{3!}+\dots$$
But how can we easily compute $A^n$? Well, it turns out, if we can diagonalize $A$ (a process which involves the eigenvalues and eigenvectors of $A$), then we can write $A=PDP^{-1}$, where $P$ is (obviously) invertible, and $D$ is a diagonal matrix. Computing $A^n$ then becomes
$$A^n=(PDP^{-1})^n=\underbrace{(PDP^{-1})(PDP^{-1})\dots(PDP^{-1})}_{n \; \text{times}}=PD^nP^{-1},$$
and it's easy to exponentiate a diagonal matrix: just exponentiate the elements on the diagonal. Finding $P$ and $D$ is a matter of finding the eigenvalues and eigenvectors of $A$.
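
As a sanity check on the recipe above, here is a short sketch (assuming NumPy and SciPy, with an arbitrarily chosen matrix) that builds $e^{tA}$ as $Pe^{tD}P^{-1}$ and compares it against SciPy's general-purpose matrix exponential:

```python
import numpy as np
from scipy.linalg import expm

# Arbitrary diagonalizable matrix and time value for illustration.
A = np.array([[0.0, 1.0],
              [1.0, 0.0]])
t = 0.5

# Diagonalize: A = P D P^{-1}, with D = diag(eigenvalues).
eigvals, P = np.linalg.eig(A)

# Exponentiating a diagonal matrix means exponentiating its diagonal.
exp_tD = np.diag(np.exp(t * eigvals))
exp_tA = P @ exp_tD @ np.linalg.inv(P)

print(np.allclose(exp_tA, expm(t * A)))  # True

# The solution of x' = A x with x(0) = x0 is then x(t) = e^{tA} x0.
x0 = np.array([1.0, 0.0])
print(exp_tA @ x0)
```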

The bottom line: a system of linear ODEs with constant coefficients has essentially been reduced to the problem of finding the eigenvalues and eigenvectors of a matrix, provided the diagonalization is possible (it isn't always).
 
Good answer, thank you!
 
Here's one of the things about vector spaces: they have an *algebra* (called, appropriately enough, Linear Algebra), but they also have an *arithmetic*, which is an extension of our ordinary arithmetic, using matrices and $n$-tuples.

To pass between the two, we need something called a basis. This lets us use *numbers* as matrix entries, or in the $n$-tuples. But vector spaces aren't born with a basis; a basis is something we *impose* upon a vector space (we *choose* it). And there are a LOT of choices.

If we can find a basis of eigenvectors for a given linear transformation, the matrix of that transformation in that particular basis is diagonal, which makes for "easy arithmetic". And as luck would have it, the eigenvalues of a matrix don't depend on the basis used.
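
A quick numerical illustration of that last point (the matrices here are arbitrary choices): changing the basis replaces the matrix $A$ by the similar matrix $P^{-1}AP$, and the eigenvalues come out the same.

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])   # a linear map in one basis
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])   # any invertible change-of-basis matrix

# The same linear map expressed in the new basis.
B = np.linalg.inv(P) @ A @ P

print(np.sort(np.linalg.eigvals(A)))  # [2. 5.]
print(np.sort(np.linalg.eigvals(B)))  # [2. 5.] -- basis-independent
```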

There is another reason eigenvalues and eigenvectors are important: they reveal a deep connection between vector spaces (more specifically, the linear mappings between them) and polynomials, through the characteristic polynomial whose roots are the eigenvalues. This is part of the beauty of mathematics: often two seemingly unrelated structures turn out to be long-lost cousins.
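
One concrete face of that connection is the characteristic polynomial $\det(xI - A)$, whose roots are the eigenvalues, together with the Cayley-Hamilton theorem, which says every matrix satisfies its own characteristic polynomial. A small sketch (matrix chosen arbitrarily):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Coefficients of the characteristic polynomial det(xI - A):
# here x^2 - 7x + 10.
coeffs = np.poly(A)
print(np.roots(coeffs))  # its roots are the eigenvalues: 5 and 2

# Cayley-Hamilton: p(A) = A^2 - 7A + 10I is the zero matrix.
pA = A @ A - 7 * A + 10 * np.eye(2)
print(np.allclose(pA, 0))  # True
```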
 
