Understanding Eigenvectors and Eigenvalues in Linear Transformations

In summary: formally, a linear transformation is a map T: V -> W, where V and W are vector spaces, such that 1.) T(v1 + v2) = T(v1) + T(v2) for all v1, v2 in V, and 2.) T(av) = aT(v) for any scalar a. In other words, a linear transformation is a function that preserves vector addition and scalar multiplication.
  • #1
porcupine6789
Can someone explain these to me?
 
  • #2
For a given matrix A, the eigenvectors [itex]x_i[/itex] and the eigenvalues [itex]b_i[/itex] satisfy the relationship

[tex]A x_i = b_i x_i[/tex]

Eigenvectors and eigenvalues appear frequently in physics, for example in the study of vibrations.
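As a quick illustration (a minimal sketch assuming NumPy; the 2x2 matrix is just an example), you can verify this relationship numerically:

[code]
import numpy as np

# A small symmetric matrix, e.g. a stiffness-like matrix from a two-mass vibration problem
A = np.array([[2.0, -1.0],
              [-1.0, 2.0]])

# Columns of vecs are the eigenvectors x_i, vals[i] are the eigenvalues b_i
vals, vecs = np.linalg.eig(A)

for i in range(len(vals)):
    x_i = vecs[:, i]
    # A @ x_i should equal b_i * x_i
    print(np.allclose(A @ x_i, vals[i] * x_i))   # True
[/code]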
 
  • #3
At a basic level, suppose we have a vector, v, and we multiply it by a square (n-by-n) matrix A. Once we compute A*v = w (where w is some new vector), we can look at the direction of w.

If w is in the same direction as v it must be some multiple of v. In other words [tex]w = \lambda v[/tex] where lambda is just some scalar value.

If this is the case we can say that [tex]Av = \lambda v[/tex] where lambda is the eigenvalue and v is an eigenvector of the matrix A.
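For instance, here is a small sketch (assuming NumPy; the matrix and vectors are chosen only for illustration) that tests whether A*v stays parallel to v:

[code]
import numpy as np

A = np.array([[3.0, 1.0],
              [0.0, 2.0]])

def is_parallel(v, w, tol=1e-10):
    # Two 2D vectors are parallel iff v_x*w_y - v_y*w_x == 0
    return abs(v[0]*w[1] - v[1]*w[0]) < tol

v = np.array([1.0, 0.0])      # an eigenvector of A (eigenvalue 3)
u = np.array([1.0, 1.0])      # not an eigenvector

print(is_parallel(v, A @ v))  # True:  A only stretches v, here by lambda = 3
print(is_parallel(u, A @ u))  # False: A changes the direction of u
[/code]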
 
  • #4
Feldoh said:
If w is in the same direction as v it must be some multiple of v. In other words [tex]w = \lambda v[/tex] where lambda is just some scalar value.

If this is the case we can say that [tex]Av = \lambda v[/tex] where lambda is the eigenvalue and v is an eigenvector of the matrix A

I have some questions about this. If w is in the same direction as v, this means that A (which is an operator represented by the matrix A) doesn't change the direction of v, i.e. A doesn't rotate the vector v. What do we call such operators (or matrices)? Are they just linear transformations?
 
  • #5
I guess some people will explain the details. I just want to explain the main reason why you use eigenvectors.

If you find enough vectors [itex]x_i[/itex] which satisfy
[tex]A x_i = q_i x_i[/tex]
where [itex]q_i[/itex] is a scalar constant, then you can write any arbitrary vector y as a sum of these:
[tex]y = t_1 x_1 + t_2 x_2 + t_3 x_3 + \cdots[/tex]
Multiplying by the matrix A is now very easy:
[tex]A y = t_1 q_1 x_1 + t_2 q_2 x_2 + t_3 q_3 x_3 + \cdots[/tex]
You see, eigenvectors simplify matrix multiplication a lot.
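Here is a small numerical sketch of that idea (assuming NumPy; the symmetric 2x2 matrix is chosen so its eigenvectors form a basis):

[code]
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])          # symmetric, so its eigenvectors span the space

vals, vecs = np.linalg.eig(A)       # vals = q_i, columns of vecs = x_i

y = np.array([3.0, -1.0])           # an arbitrary vector

# Expansion coefficients t_i such that y = t_1*x_1 + t_2*x_2 + ...
t = np.linalg.solve(vecs, y)

# Applying A now only rescales each coefficient: A*y = sum_i t_i*q_i*x_i
Ay_via_eigen = vecs @ (vals * t)

print(np.allclose(Ay_via_eigen, A @ y))   # True
[/code]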
 
  • #6
What you are really asking for is a complete course in linear algebra!

The basic "object of study" in Linear Algebra is the "vector space", an algebraic structure in which we have defined addition and multiplication by numbers.

The functions of interest, then, are those that "preserve" these operations, A(ru + sv) = rA(u) + sA(v): the "linear transformations".

Further, the most fundamental property of linearity is this: we can "take apart" a linear problem, solve each part, then put the partial solutions together to get a solution of the original problem. If I have to solve a problem of the form Ax = B, then first solving [itex]Ax = \lambda x[/itex] shows me the best way to "take apart" the problem.
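As an illustration of "taking the problem apart" (a sketch assuming NumPy and an invertible symmetric A, chosen only as an example): expand B in the eigenbasis, divide each coefficient by its eigenvalue, and reassemble.

[code]
import numpy as np

A = np.array([[4.0, 1.0],
              [1.0, 3.0]])          # symmetric and invertible
B = np.array([1.0, 2.0])

vals, vecs = np.linalg.eig(A)       # A*v_i = lambda_i*v_i

c = np.linalg.solve(vecs, B)        # take B apart: B = sum_i c_i*v_i
x = vecs @ (c / vals)               # solve each part: A(c_i/lambda_i * v_i) = c_i*v_i

print(np.allclose(A @ x, B))        # True: the parts reassemble into a solution of Ax = B
[/code]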
 
  • #7
maxverywell said:
I have some questions about this. If w is in the same direction as v, this means that A (which is an operator represented by the matrix A) doesn't change the direction of v, i.e. A doesn't rotate the vector v. What do we call such operators (or matrices)? Are they just linear transformations?

I've typically heard them called linear transformations; however, I'm not really qualified to say whether they're anything more specific.

Formally, a linear transformation is a map T: V -> W, where V and W are vector spaces, such that

1.) T(v1 + v2) = T(v1) + T(v2), where v1, v2 are in V
2.) T(av) = aT(v), where a is a scalar

I don't have a proof on hand that these hold for the eigenvalue/eigenvector setting, but I feel like they are satisfied (see the quick numerical check below).

Also, depending on the operator, couldn't the eigenproblem be connected to certain mathematical groups? I'm mainly thinking about invariant groups.
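A quick numerical check of properties 1.) and 2.) for the map v -> Av (a minimal sketch assuming NumPy; the random matrix and vectors are placeholders, not anything from the thread):

[code]
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(3, 3))         # T(v) = A @ v is a linear transformation R^3 -> R^3
v1, v2 = rng.normal(size=3), rng.normal(size=3)
a = 2.5

# 1.) T(v1 + v2) = T(v1) + T(v2)
print(np.allclose(A @ (v1 + v2), A @ v1 + A @ v2))   # True
# 2.) T(a*v) = a*T(v)
print(np.allclose(A @ (a * v1), a * (A @ v1)))       # True
[/code]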
 
  • #8
A linear operator on a vector space is a function, F, such that F(au+ bv)= aF(u)+ bF(v) where u and v are vectors and a and b are scalars.

A matrix is, of course, an array of numbers with specific addition and multiplication defined.

We can think of n by m matrices as linear transformations from the vector space [itex]R^m[/itex] to the vector space [itex]R^n[/itex].

More generally, if F is a linear transformation from a vector space of dimension n to a vector space of dimension m, then, given specific ordered bases for the two spaces, it can be represented as an m by n matrix.

But linear transformations are not the same as matrices. For one thing, the correspondence depends on a specific basis (the matrices representing the same linear transformation in different bases are "similar" matrices however). Further, there exist infinite dimensional vector spaces in which linear transformations cannot be represented by matrices. One example is the vector space of all polynomials (with the usual definition of addition of polynomials and multiplication by numbers) with the linear operator being differentiation of the polynomial.
 
  • #9
But differentiation of polynomials in R[t] can be represented by the (infinite) matrix
[tex]\begin{bmatrix}
0&1&0&0&\cdots\\
0&0&2&0&\cdots\\
0&0&0&3&\cdots\\
\vdots&\vdots&\vdots&\vdots&\ddots\\
\end{bmatrix}[/tex]
relative to the standard basis {1, t, t^2, ...}. So can any other linear transformation between vector spaces with countable (Hamel) bases.
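A finite truncation of that matrix makes the idea concrete (a sketch assuming NumPy; polynomials of degree at most 3 are stored as coefficient vectors in the basis {1, t, t^2, t^3}):

[code]
import numpy as np

# Differentiation on polynomials of degree <= 3, in the basis {1, t, t^2, t^3}
D = np.array([[0, 1, 0, 0],
              [0, 0, 2, 0],
              [0, 0, 0, 3],
              [0, 0, 0, 0]], dtype=float)

p = np.array([5.0, -1.0, 4.0, 2.0])   # p(t) = 5 - t + 4t^2 + 2t^3

print(D @ p)                           # [-1.  8.  6.  0.], i.e. p'(t) = -1 + 8t + 6t^2
[/code]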
 
  • #10
HallsofIvy said:
A linear operator on a vector space is a function, F, such that F(au+ bv)= aF(u)+ bF(v) where u and v are vectors and a and b are scalars.

A matrix is, of course, an array of numbers with specific addition and multiplication defined.

We can think of n by m matrices as linear transformations from the vector space [itex]R^m[/itex] to the vector space [itex]R^n[/itex].

More generally, if F is a linear transformation from a vector space of dimension n to a vector space of dimension m, then, given specific ordered bases for the two spaces, it can be represented as an m by n matrix.

But linear transformations are not the same as matrices. For one thing, the correspondence depends on a specific basis (the matrices representing the same linear transformation in different bases are "similar" matrices however). Further, there exist infinite dimensional vector spaces in which linear transformations cannot be represented by matrices. One example is the vector space of all polynomials (with the usual definition of addition of polynomials and multiplication by numbers) with the linear operator being differentiation of the polynomial.

Well, sure, but if the OP doesn't know what an eigenvector or eigenvalue is, it's best to start with the finite-dimensional case.
 

1. What are eigenvectors and eigenvalues?

Eigenvectors and eigenvalues are central concepts in linear algebra. An eigenvector of a matrix (or linear transformation) A is a nonzero vector v whose direction is left unchanged by the transformation, i.e. Av = λv. The corresponding eigenvalue λ is the scalar that tells how much the eigenvector is scaled by the transformation.

2. What are the applications of eigenvectors and eigenvalues?

Eigenvectors and eigenvalues have a wide range of applications in fields such as physics, engineering, and data analysis. They are used to solve systems of linear differential equations, analyze the stability and vibration modes of physical systems, and reduce the dimensionality of data in machine learning.
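As one concrete sketch of the differential-equation application (assuming NumPy; the 2x2 system is purely illustrative): for x'(t) = A x(t), expanding the initial condition in eigenvectors gives the solution x(t) = sum_i c_i e^{lambda_i t} v_i.

[code]
import numpy as np

A = np.array([[-2.0, 1.0],
              [1.0, -2.0]])         # symmetric, so real eigenvalues and eigenvectors
x0 = np.array([1.0, 0.0])           # initial condition x(0)

vals, vecs = np.linalg.eig(A)
c = np.linalg.solve(vecs, x0)       # x0 = sum_i c_i * v_i

def x(t):
    # x(t) = sum_i c_i * exp(lambda_i * t) * v_i  solves  x'(t) = A x(t)
    return vecs @ (c * np.exp(vals * t))

# Check the ODE at t = 0.7 with a small central finite difference
t, h = 0.7, 1e-6
dxdt = (x(t + h) - x(t - h)) / (2 * h)
print(np.allclose(dxdt, A @ x(t), atol=1e-5))   # True
[/code]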

3. How do you find eigenvectors and eigenvalues?

To find the eigenvalues of a matrix A, solve the characteristic equation det(A − λI) = 0, where I is the identity matrix. Each resulting eigenvalue λ can then be substituted back into (A − λI)v = 0 and solved for the corresponding eigenvectors v.
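A brief sketch of that procedure (assuming NumPy; np.poly returns the characteristic-polynomial coefficients of a square matrix, and np.roots finds their roots):

[code]
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Coefficients of det(lambda*I - A); here lambda^2 - 4*lambda + 3
char_poly = np.poly(A)
print(np.sort(np.roots(char_poly)))      # [1. 3.] -- roots of the characteristic equation
print(np.sort(np.linalg.eigvals(A)))     # [1. 3.] -- the same eigenvalues

# The eigenvalues then give the eigenvectors via (A - lambda*I)v = 0
vals, vecs = np.linalg.eig(A)
for lam, v in zip(vals, vecs.T):
    print(np.allclose(A @ v, lam * v))   # True for each eigenpair
[/code]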

4. What is the significance of the eigenvectors with the largest eigenvalues?

The eigenvector belonging to the largest-magnitude eigenvalue is known as the dominant eigenvector. It is of particular interest because it is the direction in which the transformation acts most strongly. This is useful in applications such as principal component analysis, where the dominant eigenvector of a dataset's covariance matrix points along the direction of greatest variance, i.e. the most informative dimension of the data.
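One standard way to extract the dominant eigenvector is power iteration; a minimal sketch (assuming NumPy and a matrix whose largest-magnitude eigenvalue is unique):

[code]
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])           # eigenvalues 5 and 2

v = np.array([1.0, 0.0])             # any start with a component along the dominant eigenvector
for _ in range(50):
    v = A @ v
    v = v / np.linalg.norm(v)        # repeated multiplication converges to the dominant eigenvector

dominant_eigenvalue = v @ A @ v      # Rayleigh quotient (v has unit length), ~5.0
print(dominant_eigenvalue)
print(np.allclose(np.max(np.abs(np.linalg.eigvals(A))), dominant_eigenvalue))  # True
[/code]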

5. Can a matrix have more than one set of eigenvectors and eigenvalues?

Yes. The characteristic equation of an n-by-n matrix is a polynomial of degree n, so the matrix has n eigenvalues when counted with multiplicity (some may coincide and some may be complex). Each eigenvalue comes with its own eigenspace of eigenvectors, and eigenvectors belonging to distinct eigenvalues are linearly independent. Every such eigenvalue/eigenvector pair satisfies Av = λv for that particular matrix.
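A small sketch of a repeated eigenvalue with a multi-dimensional eigenspace (assuming NumPy; the diagonal matrix is just an example):

[code]
import numpy as np

# Eigenvalue 2 repeated twice (eigenspace of dimension 2) plus eigenvalue 5
A = np.diag([2.0, 2.0, 5.0])

vals, vecs = np.linalg.eig(A)
print(vals)                               # [2. 2. 5.]
print(np.linalg.matrix_rank(vecs) == 3)   # True: three linearly independent eigenvectors
[/code]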
