What's the utility of the eigenvectors of a matrix?

SUMMARY

Eigenvectors are useful across many fields, including quantum mechanics and engineering. They simplify matrix operations by allowing diagonalization, which transforms complex matrices into simpler forms. For instance, the matrix A = [0 1; -2 3] has eigenvalues 1 and 2, with corresponding eigenvectors [1 1] and [1 2]. Diagonalization facilitates solving equations and analyzing systems, as in vibration analysis and stress analysis, where eigenvalues represent natural frequencies and principal stresses, respectively.

PREREQUISITES
  • Understanding of linear algebra concepts, particularly eigenvalues and eigenvectors.
  • Familiarity with matrix operations and diagonalization techniques.
  • Basic knowledge of quantum mechanics and its mathematical representations.
  • Experience with applications of eigenvalues in engineering contexts, such as vibration and stress analysis.
NEXT STEPS
  • Study the process of diagonalization of matrices, focusing on symmetric matrices.
  • Explore the applications of eigenvalues in vibration analysis and buckling analysis.
  • Learn about the role of eigenvectors in quantum mechanics, particularly in relation to observables.
  • Investigate Noether's theorem and its implications for symmetries in physical systems.
USEFUL FOR

Mathematicians, physicists, engineers, and students interested in linear algebra, quantum mechanics, and applications of eigenvalues in real-world problems will benefit from this discussion.

meteor
What's the utility of the eigenvectors of a matrix?
I know it has something to do with quantum mechanics.
 
Then you know wrong.


It is true that, in one form of Quantum Mechanics, we can represent "operators" by (infinite dimensional) matrices and then all properties of matrices and linear operators are important.


One point of the "eigenvalue, eigenvector" problem is this:

If we can find a "complete set of eigenvectors" (a basis for the vector space consisting entirely of eigenvectors) then the matrix has a very simple form: it is diagonal with the eigenvalues on the diagonal.

For example, the eigenvalues of the matrix A = [0 1; -2 3] are 1 and 2.

The vector [1 1] is an eigenvector with eigenvalue 1, and the vector [1 2] is an eigenvector with eigenvalue 2.

Let P = [1 1; 1 2] be the matrix whose columns are those eigenvectors. Its inverse matrix is P⁻¹ = [2 -1; -1 1].

Now you can calculate yourself that P⁻¹AP = [1 0; 0 2], and so P [1 0; 0 2] P⁻¹ = A.

Solving equations, etc. connected with [0 1; -2 3] can be reduced to problems with [1 0; 0 2], which is much simpler.
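The worked example above can be checked numerically. Here is a minimal sketch using NumPy (not part of the original thread), verifying that conjugating A by the eigenvector matrix P produces the diagonal matrix of eigenvalues:

```python
import numpy as np

# The matrix from the example above.
A = np.array([[ 0.0, 1.0],
              [-2.0, 3.0]])

# Columns of P are the eigenvectors [1, 1] and [1, 2].
P = np.array([[1.0, 1.0],
              [1.0, 2.0]])

# Conjugating A by P should give diag(1, 2).
D = np.linalg.inv(P) @ A @ P
print(np.round(D, 10))        # [[1. 0.], [0. 2.]]

# np.linalg.eig recovers the same eigenvalues directly.
vals, vecs = np.linalg.eig(A)
print(np.sort(vals))          # [1. 2.]
```

`np.linalg.eig` returns unit-length eigenvectors, so its columns are scalar multiples of [1 1] and [1 2] rather than those exact vectors; any nonzero multiple of an eigenvector is still an eigenvector.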
 
Eigenvectors can also be used to form a basis for the vector space, when the matrix is diagonalizable.

Fun stuff :smile:
 


Originally posted by meteor
What's the utility of the eigenvectors of a matrix?
I know it has something to do with quantum mechanics.

Start by thinking of a simple vector in 2-D. In general, it has two components. However, there is always a coordinate system in which one of the coordinates is zero (the system in which one axis is coincident with the vector).

Now think of a 3x3 matrix as representing something more complicated than a vector (it's called a tensor but that doesn't matter here). In general, the matrix will have 9 components (in 3-D), three of which are on the 'main diagonal' (top left to bottom right) and the other six of which are not. Again, there is always a coordinate system in which the 6 'off-diagonal' components are zero. In this system, the three components on the diagonal are the eigenvalues.

They have a whole range of applications. In vibration analysis, the eigenvalues are the natural frequencies (well, their squares, but who cares), in buckling analysis the eigenvalues are the critical loads and the eigenvectors the buckled shapes, in stress analysis the eigenvalues are the 'principal stresses' that control fracture, and so on.
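The stress-analysis case can be sketched numerically. The numbers below are made up for illustration: for a symmetric stress tensor, `numpy.linalg.eigh` returns the principal stresses (eigenvalues) and the principal axes (orthonormal eigenvectors), and rotating into those axes zeroes the off-diagonal terms:

```python
import numpy as np

# A made-up symmetric 3x3 stress tensor (units arbitrary).
sigma = np.array([[50.0, 30.0,  0.0],
                  [30.0, -20.0, 0.0],
                  [ 0.0,  0.0, 10.0]])

# eigh is specialized to symmetric/Hermitian matrices; it returns
# real eigenvalues and orthonormal eigenvectors (the principal axes).
principal, axes = np.linalg.eigh(sigma)

print(principal)   # the principal stresses, in ascending order

# In the principal-axis frame the off-diagonal components vanish:
print(np.round(axes.T @ sigma @ axes, 10))
```

The orthonormality of the eigenvectors is what makes the principal axes a genuine rotated coordinate system, which is exactly the "coordinate system in which the off-diagonal components are zero" described above.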

More abstractly, multiplying a vector by a square matrix generally causes both rotation and dilation of the vector. However, for a given matrix, certain vectors are simply dilated, without rotating, i.e:

A.x = m.x

where A is the matrix, x is such a vector (an eigenvector) and m is a scalar called the eigenvalue.
 
The eigenvectors of the operators in QM allow you to obtain the quantum numbers of the system. If you consider, for example, the group SU(2), the eigenvalues of its Casimir operator on the irreducible representations give you the quantum numbers. This elementary example led to the operator language in which modern physical theories are modeled.
 
Another simple example: for a rotation in space, every vector along the axis is an eigenvector, since it is not affected by the rotation.
Related to that, there is Noether's theorem, which IIRC states that each continuous symmetry generates a conserved quantity. E.g. if a system is invariant under time translation, then it has constant energy.
(Just a rough sketch of what it has to do with physics. Some may say I'm totally wrong here...)
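The rotation-axis observation is easy to check. A sketch (not from the thread): a rotation about the z-axis leaves [0, 0, 1] unchanged, so that vector is an eigenvector with eigenvalue 1.

```python
import numpy as np

theta = 0.7  # any rotation angle, in radians

# Rotation about the z-axis.
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])

axis = np.array([0.0, 0.0, 1.0])

# The axis is unchanged by the rotation: R @ axis == axis,
# i.e. axis is an eigenvector with eigenvalue 1.
print(R @ axis)
```

The other two eigenvalues of a 3-D rotation are the complex pair e^(±iθ), which is why no other real direction is left fixed (for a generic angle).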
 


Originally posted by rdt2
Now think of a 3x3 matrix as representing something more complicated than a vector (it's called a tensor but that doesn't matter here). In general, the matrix will have 9 components (in 3-D), three of which are on the 'main diagonal' (top left to bottom right) and the other six of which are not. Again, there is always a coordinate system in which the 6 'off-diagonal' components are zero. In this system, the three components on the diagonal are the eigenvalues.

This is not true! Not all matrices are diagonalizable!
A restriction to symmetric matrices would be more appropriate here as an illustration of the physical properties of eigenvalues...
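The classic counterexample here is a shear matrix (a sketch, not from the thread): it has the eigenvalue 1 repeated twice, but only a one-dimensional eigenspace, so no basis of eigenvectors exists and it cannot be diagonalized.

```python
import numpy as np

# Shear matrix: eigenvalue 1, repeated twice...
J = np.array([[1.0, 1.0],
              [0.0, 1.0]])

vals, vecs = np.linalg.eig(J)
print(vals)   # [1. 1.]

# ...but the computed eigenvectors are (numerically) parallel,
# both lying along [1, 0], so the eigenvector matrix is singular:
print(vecs)
print(np.linalg.det(vecs))   # ~0: no invertible P of eigenvectors exists
```

Symmetric matrices never have this problem: the spectral theorem guarantees a full orthonormal set of eigenvectors, which is why the physical examples above restrict to the symmetric case.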
 


Originally posted by dg
This is not true! Not all matrices are diagonalizable!
A restriction to symmetric matrices would be more appropriate here as an illustration of the physical properties of eigenvalues...

Point taken! I seldom have to deal with non-symmetric matrices.

ron.
 
Any normal n×n matrix (one satisfying A*A = AA*) is diagonalisable, by a unitary matrix in fact.
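A rotation matrix illustrates this (a sketch, not from the thread): it is normal but not symmetric, so it is not diagonalizable over the reals, yet it is diagonalized by a unitary matrix over the complex numbers:

```python
import numpy as np

theta = 0.5

# A 2-D rotation: normal (A A^T = A^T A) but not symmetric.
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

assert np.allclose(A @ A.T, A.T @ A)   # normality

vals, U = np.linalg.eig(A)
print(vals)   # the complex pair exp(+/- i*theta)

# For a normal matrix the eigenvectors can be chosen orthonormal,
# so U is unitary and U* A U is diagonal:
print(np.round(np.conj(U.T) @ A @ U, 10))
```

Here the eigenvectors returned for the two distinct eigenvalues are automatically orthogonal (a property of normal matrices), so conjugating by U* rather than by an explicit inverse already diagonalizes A.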
 
In QM:
The eigenvectors of the matrix representing an observable give its pure states. As the set of states is convex, every state can then be obtained (as a mixture) from the eigenvectors.
 

Similar threads

  • · Replies 33 ·
2
Replies
33
Views
2K
  • · Replies 5 ·
Replies
5
Views
3K
  • · Replies 24 ·
Replies
24
Views
4K
  • · Replies 2 ·
Replies
2
Views
3K
  • · Replies 13 ·
Replies
13
Views
3K
  • · Replies 10 ·
Replies
10
Views
3K
  • · Replies 14 ·
Replies
14
Views
4K
  • · Replies 1 ·
Replies
1
Views
2K
  • · Replies 2 ·
Replies
2
Views
2K
  • · Replies 1 ·
Replies
1
Views
1K