I don't get Eigenvalues or Eigenvectors

by Alex6200
Tags: eigenvalues, eigenvectors
Saladsamurai
#19
Jun14-08, 06:02 PM
P: 3,015
Quote by Alex6200:
Well, I took differential equations at the community college before heading off to college (Worcester Polytechnic Institute, Electrical Engineering), and I'm going to take Linear Algebra first semester there. I couldn't take linear algebra at my high school or local community college.

The thing is, I understand eigenvalues in a formal sense. I know how to use them to solve differential equations. I know the definitions, I just don't... get them, if you know what I mean. Like, I can't just look at a matrix and think "Wow, that matrix must have some big eigenvalues", the way I'd look at a parabola and immediately get a sense of what it would look like.
Alex, what community college did you go to? Was it NSCC by chance? If so, I may have been in that class with you.
Alex6200
#20
Jun14-08, 11:37 PM
P: 75
Quote by Saladsamurai:
Alex, what community college did you go to? Was it NSCC by chance? If so, I may have been in that class with you.
Ugh, Carroll.
Saladsamurai
#21
Jun15-08, 11:28 PM
P: 3,015
Quote by Alex6200:
Ugh, Carroll.
Is Carroll the name of the college? Or are you asking me if my name is Carroll? Because it's not, it's Casey.
Alex6200
#22
Jun16-08, 09:08 AM
P: 75
No, Carroll's the name of the college.
Howers
#23
Jul16-08, 10:36 AM
P: 444
edit
Howers
#24
Jul16-08, 11:31 AM
P: 444
If this guy hasn't done any linear algebra, how do you expect him to know about linear maps, or even analytic geometry? I will offer a layman's explanation, and hopefully you can build on it when you take linear algebra.

First of all, you need to know that vectors in [tex]\mathbb{R}^n[/tex] can be represented as column matrices. For example, the vector (1, -4, 3) looks like this in linear algebra: [tex]\begin{pmatrix}
1 \\
-4 \\
3 \\
\end{pmatrix}
[/tex]

"Eigen" is German for "own" or "characteristic". When you are working with a matrix A, you are trying to find an X and a [tex]\lambda[/tex] such that AX = [tex]\lambda[/tex]X. Here, X is the vector (represented as a column matrix) and [tex]\lambda[/tex] is some number. It should be clear that when multiplying matrices A (n x n) and X (n x 1) you get an n x 1 matrix in return. As you know, multiplying X by a scalar will also give you an n x 1 matrix. So you are trying to find an X, called the eigenvector, so that AX is equivalent to X multiplied by some scalar [tex]\lambda[/tex], called the eigenvalue. Sometimes more than one X will work for a certain [tex]\lambda[/tex].
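To make that concrete, here is a rough numerical sketch in Python with numpy (my own toy example, nothing special about it): for the matrix A = [[2, 1], [1, 2]], the column vector X = (1, 1) happens to be an eigenvector with [tex]\lambda[/tex] = 3, and you can check AX = [tex]\lambda[/tex]X directly.

[code]
import numpy as np

# A concrete 2x2 matrix chosen purely for illustration.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# X = (1, 1) written as a column matrix, and the claimed eigenvalue 3.
X = np.array([[1.0],
              [1.0]])
lam = 3.0

# AX and lambda*X should be the same n x 1 column matrix.
print(A @ X)                         # [[3.] [3.]]
print(lam * X)                       # [[3.] [3.]]
print(np.allclose(A @ X, lam * X))   # True
[/code]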

To find eigenvectors, you work with AX = [tex]\lambda[/tex]X, which is the same as AX = [tex]\lambda[/tex]IX, which is the same as (A - [tex]\lambda[/tex]I)X = 0. Obviously X = 0 works for this, but by definition the zero vector is never allowed as an eigenvector. The only way to guarantee that there exists some X other than zero is to make (A - [tex]\lambda[/tex]I) non-invertible.

Theorem: Suppose BX = 0. If B is invertible, then X = 0 is the only solution.
Proof: If B is invertible, then there is some [tex]B^{-1}[/tex] such that [tex]B^{-1}B = I[/tex].
So X = IX = [tex]B^{-1}[/tex]BX = [tex]B^{-1}[/tex](BX) = 0, because BX = 0.

See? The theorem tells us (A - [tex]\lambda[/tex]I) must not be invertible, otherwise we will only have X = 0 as an eigenvector, which is not allowed! The way you arrange this is to make the determinant zero. There is a theorem in linear algebra that says A is non-invertible iff its determinant is zero. So that's what you are doing when finding eigenvalues: expanding the determinant in terms of [tex]\lambda[/tex] and setting it equal to zero. Only the matrices (A - [tex]\lambda[/tex]I) with those values of [tex]\lambda[/tex] are non-invertible, so they are the only ones with a nonzero X.
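Here is a rough sketch of that recipe in numpy, with the same toy matrix as above (numpy.poly returns the coefficients of the characteristic polynomial of a square matrix, and its roots are the eigenvalues):

[code]
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Coefficients of the characteristic polynomial det(lambda*I - A),
# here lambda^2 - 4*lambda + 3.
coeffs = np.poly(A)
print(coeffs)            # [ 1. -4.  3.]

# Its roots are exactly the eigenvalues of A: 3 and 1 (order may vary).
print(np.roots(coeffs))

# Sanity check: det(A - lambda*I) is (numerically) zero at each root.
for lam in np.roots(coeffs):
    print(np.linalg.det(A - lam * np.eye(2)))   # ~0.0 each time
[/code]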

Once you have your eigenvalues, you find out which Xs work, which you already know how to do (solve (A - [tex]\lambda[/tex]I)X = 0 for each [tex]\lambda[/tex]). Strictly speaking there are infinitely many Xs that work, since any nonzero scalar multiple of an eigenvector is again an eigenvector, so we usually just record one representative. So you've found an eigenvector such that AX is the same as [tex]\lambda[/tex]X. Pretty weird, huh? A matrix acting like a scalar toward some X! Each [tex]\lambda[/tex] has its own eigenvectors, and sometimes more than one independent eigenvector is possible.
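In practice you can let the computer solve (A - [tex]\lambda[/tex]I)X = 0 for you. A small sketch with numpy's eig, which hands back one representative (unit-length) eigenvector per eigenvalue, matching the "pick one representative" convention above:

[code]
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigvals[i] is an eigenvalue, eigvecs[:, i] is a matching eigenvector.
eigvals, eigvecs = np.linalg.eig(A)
print(eigvals)   # 3.0 and 1.0 (order not guaranteed)

for i, lam in enumerate(eigvals):
    X = eigvecs[:, i]
    # A X and lambda X agree, and so does any scalar multiple of X.
    print(np.allclose(A @ X, lam * X))                # True
    print(np.allclose(A @ (5 * X), lam * (5 * X)))    # True: 5X also works
[/code]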

Eigenvalues and eigenvectors are used in linear algebra to diagonalize matrices, that is, to put them into a nice, simple form. Certain matrices have predictable eigenvalues; others, not so much. You will learn other uses for them in linear algebra. For now, I hope this clears things up.
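P.S. A small sketch of that diagonalization, again with the toy matrix from before: stacking the eigenvectors as the columns of a matrix P gives [tex]P^{-1}AP = D[/tex], a diagonal matrix with the eigenvalues on its diagonal.

[code]
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigvals, P = np.linalg.eig(A)   # columns of P are eigenvectors of A

# P^{-1} A P is diagonal, with the eigenvalues on the diagonal
# (in the same order as eigvals).
D = np.linalg.inv(P) @ A @ P
print(np.round(D, 10))
print(np.allclose(D, np.diag(eigvals)))   # True
[/code]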
insanenoodle
#25
Oct14-08, 10:59 AM
P: 1
Quote by trambolin:
A very stupid example, but it sometimes works on weird individuals like me... (and it is not general enough to capture all the properties), but life is too short to explain everything in one shot, so here goes.

Imagine you are pushing a box with a very weird shape, and suppose you don't see the whole box, so it is difficult to estimate how to push it.

You want to push it so that the direction of the force you are applying is exactly the direction the box moves, meaning the force vector and the displacement vector point in the same direction, possibly with different magnitudes! What usually happens is that you get both translation and rotation if you choose a bad direction. But there might be a direction where you get pure translation. That is your eigenvector. How much translation you get is proportional to your force, and that is your eigenvalue with respect to that direction (or eigenvector). Now let A be your matrix describing the relations between the individual directions and the translations you get in each direction; then, after some concrete argument, you can show that there are some directions in which your biiiiig matrix really acts like a scalar (you get only translation in that particular direction).

Now, if it makes sense, read the rigorous arguments above again. If not, forget about this immediately :)
My God. This explanation is quite possibly the best I've ever heard. Looking at it the way you described makes eigenvalues/eigenvectors much more intuitive than throwing around numbers and vague terms. Better than any of the ways my professors have tried explaining it. This coming from a junior at Carnegie Mellon University...
Tac-Tics
#26
Oct14-08, 02:34 PM
P: 810
Quote by Alex6200:
But linear algebra just doesn't make sense to me.
I have the same problem sometimes (though I've studied linear algebra for a few years now).

Sometimes I think what we really want isn't so much to know what something is, but to develop an intuitive understanding of it. You know what an eigenvector of an operator is: it's any vector whose direction is not changed under that operator (only its length). And the eigenvalues are the scalars that give the change in length.

But what does it physically signify? I'm not sure myself (I was actually browsing through Wikipedia earlier today for just this reason). But eigenvectors and eigenvalues have lots of super weird properties. The sum of the eigenvalues is the trace of the operator. The product of the eigenvalues is the determinant. The determinant is an expression that "determines" the number of solutions (values for x) of the equation Tx = y (where T is a transformation and y is a vector). The determinant is also used in finding the inverse transformation.
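You can at least spot-check those two facts numerically; here is a quick Python/numpy sketch with an arbitrary toy matrix (nothing special about it):

[code]
import numpy as np

# Any square matrix will do; this one is just an arbitrary example.
T = np.array([[4.0, 1.0, 0.0],
              [2.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

eigvals = np.linalg.eigvals(T)

# Sum of eigenvalues = trace; product of eigenvalues = determinant.
print(np.isclose(eigvals.sum(),  np.trace(T)))        # True
print(np.isclose(eigvals.prod(), np.linalg.det(T)))   # True
[/code]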

But I don't understand this crap either, so I'll leave you with this instead!


Neo: What is the Matrix?
Trinity: The answer is out there, Neo, and it's looking for you, and it will find you if you want it to.
crasshopper
#27
Feb28-11, 08:08 PM
P: 3
There are several more good answers on Quora.

http://www.quora.com/What-is-the-bes...igenvalues-are

(admin - sorry if this is against link policy)
MathAmateur
#28
Mar1-11, 12:12 AM
P: 32
The classic example is the rotating Earth. Imagine creating a 3-D vector for each point on the surface of the Earth. As the Earth rotates, all but two of these vectors change direction. The two that don't change direction (at the North and South Poles) are the eigenvectors. And since they also don't change length, their eigenvalues are one.
HallsofIvy
#29
Mar1-11, 07:48 AM
Math Emeritus, Sci Advisor, PF Gold
P: 39,310
How did you unearth this thread (and why)? It is about 2 and a half years old.

And your example isn't very clear. You say "create a 3-D vector for each point on the surface of the Earth", but you appear to be assuming, without saying it, that the vector at each point points directly away from the center of the Earth.
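With that assumption made explicit (vectors pointing radially outward, rotation axis along z), here is a rough numpy sketch of the example:

[code]
import numpy as np

theta = np.radians(30.0)   # rotate the "Earth" by 30 degrees about the z-axis
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])

pole = np.array([0.0, 0.0, 1.0])      # radial vector at the North Pole (on the axis)
equator = np.array([1.0, 0.0, 0.0])   # radial vector at a point on the equator

print(np.allclose(R @ pole, 1.0 * pole))   # True: eigenvector with eigenvalue 1
print(np.allclose(R @ equator, equator))   # False: this vector changes direction
[/code]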
mathwonk
#30
Mar1-11, 03:19 PM
Sci Advisor, HW Helper
P: 9,453
That's true, Halls, but at least for someone who understands it, it is a pretty nice example, and by now the OP is probably long gone! So we can enjoy it and think about how to use it the next time this is asked. Best wishes.

