What Are the Real Meanings and Purposes of Eigenvalues and Eigenvectors?

AI Thread Summary
Eigenvalues and eigenvectors simplify the analysis of square matrices by allowing them to be expressed in diagonal form, making calculations easier. A matrix can be diagonalizable even if not all of its eigenvalues are distinct, provided its eigenvectors are linearly independent and form a basis. Eigenvectors are the directions in which the linear transformation acts as simple scalar multiplication, with the corresponding eigenvalue giving the factor of stretching or compression. Eigenvalues are found by solving the determinant equation det(A - λI) = 0; when this is written with indices, the identity matrix is the Kronecker delta of tensor calculus. These concepts are fundamental to applications across mathematics and engineering.
shankar
Can anyone explain the real meaning and purpose of eigenvalues and eigenvectors?

:smile:
 
You know how easy it is to work with diagonal matrices, right?

Consider the fact that (nearly) every square matrix can, after a suitable change of basis, be written as a diagonal matrix whose entries are simply its eigenvalues.

So in one sense, using eigenvalues and eigenvectors lets you treat (nearly) any matrix as if it were a diagonal matrix, which makes the work easier.
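
For concreteness, here is a minimal numerical sketch of that idea (using numpy; the 2x2 matrix is an illustrative assumption, not anything from the thread):

```python
# A minimal numerical sketch of diagonalization (assumes numpy; the 2x2 matrix
# is an illustrative choice, not anything from the thread).
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Columns of P are eigenvectors; D carries the eigenvalues on its diagonal.
eigenvalues, P = np.linalg.eig(A)
D = np.diag(eigenvalues)

# Change of basis: A = P D P^{-1}, i.e. A is diagonal in the eigenvector basis.
assert np.allclose(A, P @ D @ np.linalg.inv(P))

# The payoff: anything easy for D is easy for A, e.g. A^10 = P D^10 P^{-1}.
A_pow_10 = P @ np.diag(eigenvalues**10) @ np.linalg.inv(P)
assert np.allclose(A_pow_10, np.linalg.matrix_power(A, 10))
```

Once A is factored as P D P^{-1}, anything that is easy for the diagonal matrix D (powers, exponentials, solving recurrences) becomes just as easy for A.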
 
This works provided the matrix has distinct eigenvalues, right? Or do we need a full set of linearly independent eigenvectors?
 
Actually, in order for a matrix to be diagonalizable, it is NOT necessary that all the eigenvalues be distinct. It IS necessary that the eigenvectors be independent, that is, that there exist a basis for the vector space consisting of eigenvectors.
Essentially, the eigenvectors are those vectors on which the linear transformation acts like simple scalar multiplication.
 
If A is an n by n matrix, then a non-zero vector x in R^n is an eigenvector of A if Ax is a scalar multiple of x. That scalar is called an eigenvalue of A.
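
A quick numerical check of that definition (a numpy sketch; the matrix is an illustrative assumption, not from the original post):

```python
# Verify A x = lambda x for each eigenpair (numpy sketch; the matrix is an
# illustrative assumption).
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)   # columns are the eigenvectors

# Applying A to an eigenvector only rescales it by the corresponding eigenvalue.
for lam, x in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ x, lam * x)
```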

Q) So if x is an eigenvector, then applying the matrix only 'stretches' or 'compresses' x by a factor equal to the eigenvalue? Could you explain? And how can we use determinants to find the eigenvalues of a given matrix?

(off-topic) Has this got anything to do with the Kronecker Delta in Tensor Calculus?
 
Originally posted by Oxymoron
(off-topic) Has this got anything to do with the Kronecker Delta in Tensor Calculus?

Yes, it does.

When calculating the eigenvalues {λ_n} of a matrix A, you have to solve the equation:

det(A-λI)=0.

If we rewrite that in terms of matrix elements (IOW, with indices) we can write:

det(A_ij - λI_ij) = 0,

where the identity matrix I_ij is none other than the Kronecker delta, δ_ij.
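
For a 2x2 matrix you can carry the determinant computation out by hand: det(A - λI) = λ^2 - tr(A)λ + det(A), and the eigenvalues are the roots of that polynomial. A small sketch (numpy; the matrix is an illustrative assumption):

```python
# Eigenvalues as roots of det(A - lambda*I) = 0, done explicitly for a 2x2
# matrix (numpy sketch; the matrix is an illustrative assumption).
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# For a 2x2 matrix: det(A - lambda*I) = lambda^2 - tr(A)*lambda + det(A).
char_poly = [1.0, -np.trace(A), np.linalg.det(A)]
eigenvalues = np.roots(char_poly)

# Same answer as the library routine.
assert np.allclose(sorted(eigenvalues), sorted(np.linalg.eigvals(A)))
```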
 
If we find the eigenvectors of the matrix A, will they give an orthogonal set of directions for the matrix?
 
An n by n matrix M is diagonalizable if and only if the space R^n has a basis of eigenvectors of M; equivalently, if and only if the minimal polynomial P of M is a product of distinct linear factors; equivalently, if and only if the characteristic polynomial Q splits into a product of linear factors and, for each root c of Q, the kernel of M - cI has dimension equal to the power with which the factor (X - c) occurs in Q.

see http://www.math.uga.edu/~roy/
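
A small numerical illustration of the first criterion (the existence of a basis of eigenvectors), which also shows why repeated eigenvalues alone do not decide the question; this is a numpy sketch and both matrices are illustrative assumptions:

```python
# Two matrices with the same repeated eigenvalue 1: the identity matrix has two
# independent eigenvectors (diagonalizable), while the Jordan block has only
# one eigenvector direction (not diagonalizable).
import numpy as np

candidates = {
    "identity":     np.eye(2),
    "Jordan block": np.array([[1.0, 1.0],
                              [0.0, 1.0]]),
}

for name, M in candidates.items():
    _, V = np.linalg.eig(M)                  # columns of V are numerical eigenvectors
    has_basis = np.linalg.matrix_rank(V) == M.shape[0]
    print(f"{name}: basis of eigenvectors -> {has_basis}")
```

The identity matrix comes out diagonalizable despite its repeated eigenvalue, while the Jordan block does not, matching the earlier point that independence of the eigenvectors, not distinctness of the eigenvalues, is what matters.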
 