- #1

dduardo

Staff Emeritus

- 1,891

- 3

If anyone would be kind enough to post the importance of eigenvalues and eigenvectors, how they were developed, and possible applications for their use.

Any input is welcome.


- #2

- 841

- 1

Well, the first step in understanding anything in linear algebra is to think about it geometrically, in my opinion. Instead of matrices, think about linear transformations of vectors. i.e., think about geometric operations on vectors: think in terms of arrows rotating, stretching, shearing, etc. If you choose a basis (a set of axes), then you can write down the matrix components of the transformation in this basis; you can get infinitely many matrices (related by similarity transformations) that are all different ways of representing the same geometric transformation.

Then, consider some specific transformations, and look for vectors that are unchanged under the transformation: we are looking for *symmetry*. For instance, if you have a 3D rotation about an axis, any vector pointing along that axis will be unchanged under that rotation; that axis is a *symmetry* of the rotation. This is an example of an eigenvector.

We can relax our definition a little: let's look for vectors that are not completely unchanged by a transformation, but just have their direction unchanged. (Well, we'll count reversing direction as leaving the direction unchanged; it still points along the same line.) For example, suppose you have a transformation that stretches vectors in one direction and squashes them in another. Then those two directions are also eigenvectors of the transformation --- e.g., a vector pointing purely in the "stretch" direction gets stretched that way; it has no component in the "squash" direction, so the squashing doesn't do anything to it. It remains pointing in the stretched direction. The eigenvalues of those eigenvectors are the amount of stretching and squashing, respectively.

Now, if you looked at this transformation from the perspective of a matrix in a basis that didn't point along these directions, this nice geometric property would be obscured. But if you choose a basis consisting of eigenvectors of the transformation, then the matrix becomes simple: it's just diagonal, with the diagonal components being the eigenvalues!

(That's because a column of a matrix represents how the transformation acts on one of the basis vectors; if you choose your basis axis to be the eigenvectors, then the eigenvectors will have only one nonzero component, and since they remain in the same direction after the transformation, the transformed basis vectors will also have one nonzero component.)

Example: suppose I choose a linear transformation T(u+v) = 5u + 3v, assuming u and v are orthogonal. This transformation stretches vectors parallel to 'u' by a factor of 5, and it stretches vectors parallel to 'v' by a factor of 3. In this basis, 'u' has components [1 0]^{T} because it is written as u = 1 x u + 0 x v, and similarly 'v' has components [0 1]^{T}. After transformation, u' = T(u) = 5u = [5 0]^{T}, v' = T(v) = 3v = [0 3]^{T}. So in the {u,v} basis, the transformation T is represented by a diagonal matrix with components,

[5 0]

[0 3]
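
As a quick numerical check of the example above (same matrix and basis components, using NumPy):

```python
import numpy as np

# The transformation T from the example, written in the {u, v} eigenbasis.
T = np.array([[5.0, 0.0],
              [0.0, 3.0]])

u = np.array([1.0, 0.0])  # components of u in this basis
v = np.array([0.0, 1.0])  # components of v in this basis

# Each basis vector keeps its direction and is scaled by its eigenvalue.
assert np.allclose(T @ u, 5.0 * u)
assert np.allclose(T @ v, 3.0 * v)
```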

Not all matrices have a nice real set of eigenvalues, so they can't all be diagonalized like this. But when they can, it makes the matrix easy to deal with. The eigenvectors tell you which basis will give you the easy diagonal matrix. Even when a matrix can't be diagonalized, eigenvectors are a good way to find out in which directions the transformation is "simple", and they can simplify calculations.

Example: if you want to compute a matrix power A^{n}, you could multiply it with itself over and over. Or, you could find the eigenbasis which makes it diagonal (if such a basis exists): raising that matrix to a power is trivial, since you're just raising the diagonal elements to a power. Then, if you want, you can change the answer back to whatever basis you were originally using.
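
A minimal sketch of that trick, assuming an example matrix of my own choosing (not one from the thread):

```python
import numpy as np

# A diagonalizable example matrix (made up for illustration).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
n = 8

# Eigendecomposition A = V D V^{-1}, hence A^n = V D^n V^{-1}.
eigvals, V = np.linalg.eig(A)
Dn = np.diag(eigvals ** n)        # powering a diagonal matrix is trivial
An = V @ Dn @ np.linalg.inv(V)    # change back to the original basis

# Same answer as repeated multiplication.
assert np.allclose(An, np.linalg.matrix_power(A, n))
```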

Or, in physics, angular momentum L and angular velocity ω are related by the inertia tensor (matrix) I, by L = Iω. The eigenvectors of the inertia tensor are the "principal axes" of a body: if you rotate the body about those axes, then the angular momentum of the body will point in the same direction as the axis of rotation. So you can decompose a general rotation into rotations about those axes, and analyze them separately: they "decouple" from each other (like in the stretching/squashing example, the stretching in one direction and the squashing in another are independent of each other). You also do this "decoupling" to find normal modes of oscillation and such, in other applications: characteristic resonant behavior of a body.
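
A small numerical sketch of the principal-axes idea, using a hypothetical symmetric inertia tensor (the numbers are made up for illustration):

```python
import numpy as np

# A hypothetical symmetric inertia tensor (made-up numbers, for illustration).
inertia = np.array([[3.0, 1.0, 0.0],
                    [1.0, 3.0, 0.0],
                    [0.0, 0.0, 5.0]])

# eigh handles symmetric matrices; the eigenvector columns are the
# principal axes, and the eigenvalues are the principal moments.
moments, axes = np.linalg.eigh(inertia)

# Spinning about a principal axis, L = I @ omega is parallel to omega.
omega = axes[:, 0]
L = inertia @ omega
assert np.allclose(L, moments[0] * omega)
```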

I'll also throw in a few recommendations for my favorite linear algebra books:

Anton, *Elementary Linear Algebra*

Axler, *Linear Algebra Done Right*

Strang, *Introduction to Linear Algebra*

If matrix algebra seems like an unmotivated bunch of meaningless, mindless manipulations on a pile of numbers, these books will help. The secret is, as I said, looking at the geometry of linear transformations acting on abstract vector spaces, not in matrix gymnastics.

By the way, linear algebra lies behind *everything* in physics: it's one of the most underrated math courses. Quantum mechanics is just linear algebra on infinite-dimensional vectors. Special relativity involves linear algebra on 4D spacetime vectors (Lorentz transformations are analogous to rotations). General relativity involves lots of tensors, which are generalizations of matrices that act on more than one vector at a time (multilinear transformations). Mechanics has inertia tensors, elasticity tensors, normal modes of wave equations ... electromagnetism has field tensors ... and so on. I had a particularly strong linear algebra background; my linear algebra course has served me better than any other single course that I've ever taken.


- #3

dduardo

Staff Emeritus

- 1,891

- 3

I had no idea that a basis was a set of axes. Now it makes sense that the number of vectors in the basis defines the dimension of the vector space.

- #4

HallsofIvy

Science Advisor

Homework Helper

- 41,847

- 966

The basic theory of linear differential equations IS linear algebra: the set of all solutions to a linear homogeneous differential equation forms a vector space. And 90% of solving non-linear differential equations consists of reducing them to linear equations!

Linear algebra really is the theory of "linear problems".
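
A quick numerical illustration of that point, assuming the concrete equation y'' + y = 0 (a standard example, not one from the post):

```python
import math

# sin(t) and cos(t) both solve y'' + y = 0, and so does any linear
# combination: the solution set is a (2-dimensional) vector space.
def y(t, a=2.0, b=-3.0):
    return a * math.sin(t) + b * math.cos(t)

# Approximate y'' with a central difference and check y'' + y ~ 0.
h, t = 1e-4, 0.7
ypp = (y(t + h) - 2.0 * y(t) + y(t - h)) / h ** 2
assert abs(ypp + y(t)) < 1e-4
```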

- #5

- 1

- 0

newbie here

what's the significance of eigenvectors in terms of describing oscillation?

- #6

- 841

- 1

Originally posted by robinyau

whats the sgnificance of eigenvectors in terms of decribing oscillation?

Eigenvectors in a linear mechanical system describe normal modes of oscillation, which are resonant modes that "decouple" from each other (oscillate independently of each other); you can describe the general motion as a superposition of these modes. See, for instance,

http://othello.mech.northwestern.edu/ea3/book/modes1/modes.html
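
Here is a minimal sketch of that decoupling, assuming a toy system of my own choosing (two unit masses between walls, joined by three unit springs; not the example at the link):

```python
import numpy as np

# Toy system: two unit masses between walls, joined by three unit springs,
# so the equations of motion are x'' = -K x with this stiffness matrix.
K = np.array([[ 2.0, -1.0],
              [-1.0,  2.0]])

# Eigenvalues are the squared mode frequencies; eigenvectors are the modes.
w2, modes = np.linalg.eigh(K)
freqs = np.sqrt(w2)   # frequencies 1 and sqrt(3): in-phase and out-of-phase
```

Each column of `modes` oscillates independently, and any motion of the two masses is a superposition of the two columns.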

- #7

- 1

- 0

hi, I'm a newbie also, but speaking of eigenvectors, what's the importance of normalizing eigenvectors?

Dan

- #8

- 841

- 1

Originally posted by Dan_potato

hi, I'm a newbie also, but speaking of eigenvectors, what's the importance of normalizing eigenvectors?

It's just convenient. An eigenvector isn't uniquely defined, since you can multiply any eigenvector by any number and get another eigenvector. All the eigenvectors obtained that way correspond to the same eigenvalue (as long as you don't multiply by zero). People like to single out one of them as "representative" of the bunch, and since they differ only in their lengths (up to a sign), the simplest way to do that is to pick the one with unit length.
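
A small numerical sketch of that scaling freedom, using a diagonal matrix made up for illustration:

```python
import numpy as np

# A diagonal matrix made up for illustration; (0, 1) is an eigenvector
# with eigenvalue 7.
A = np.array([[2.0, 0.0],
              [0.0, 7.0]])
v = np.array([0.0, 1.0])

# Every nonzero multiple of v is also an eigenvector, with the same eigenvalue.
for c in (2.0, -0.5, 100.0):
    assert np.allclose(A @ (c * v), 7.0 * (c * v))

# The conventional "representative" is the one with unit length:
v_hat = v / np.linalg.norm(v)
```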

- #9

- 6

- 0

OK, so now I understand the importance of normalising eigenvectors, but how do you actually do this operation?

for example given the matrix

3 0 0

5 4 0

3 6 1

I calculate the eigenvalues to be 3, 4, and 1.

Using eigenvalue = 3, I get an eigenvector of

k

-13.5k

-5k

So how do you normalise this? Any help would be greatly appreciated.

- #10

- 841

- 1

Originally posted by bracey

I get an eigenvector of

k

-13.5k

-5k

Are you sure? I get an eigenvector of,

[tex]

\begin{pmatrix}1 \\ -5 \\ -13.5 \end{pmatrix}

[/tex]

so how do you normalise this,

The same way you normalize any vector: divide it by its magnitude.
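
In NumPy, for the vector above, that looks like:

```python
import numpy as np

v = np.array([1.0, -5.0, -13.5])   # the eigenvector from the post above
v_hat = v / np.linalg.norm(v)      # divide by the magnitude

# v_hat is approximately [0.069, -0.346, -0.935] and has unit length.
assert np.isclose(np.linalg.norm(v_hat), 1.0)
```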

- #11

- 6

- 0

So if I got this right, then the normalised eigenvector should be

0.069

-0.346

-0.935

- #12

- 841

- 1

Yes, that's right.

- #13

- 6

- 0

thanks for the help Ambitwistor, much appreciated

- #14

- 5

- 0

I'm a bit stuck on working out eigenvectors for 3×3 matrices; I get confused when trying to get the relationships between the three values. Hope that makes sense!

- #15

- 841

- 1

- #16

- 5

- 0

1

-5

-13.5

I can get the equations out of the matrix OK, but then get stuck.

- #17

- 841

- 1

3x = 3x

5x+4y = 3y

3x+6y+z = 3z

The first equation gives x=1. The second gives 5+4y=3y, or y=-5. The third gives 3-30+z = 3z, or 2z = -27, or z = -13.5.
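
A quick numerical cross-check of that eigenvector, assuming NumPy is available:

```python
import numpy as np

# The matrix from earlier in the thread.
A = np.array([[3.0, 0.0, 0.0],
              [5.0, 4.0, 0.0],
              [3.0, 6.0, 1.0]])
v = np.array([1.0, -5.0, -13.5])

# A v should equal 3 v if (1, -5, -13.5) is an eigenvector for eigenvalue 3.
assert np.allclose(A @ v, 3.0 * v)
```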

- #18

- 5

- 0

So you get [3x, 5x+4y, 3x+6y+z] again, but this time it has to equal 4 times the vector:

3x = 4x

5x+4y = 4y

3x+6y+z = 4z

So the first gives x = 1.3, the second gives y = 0, and the third gives z = 1.3 again.

Have I got that right, or have I just made an idiot of myself? If I have, then I blame the hard day I've had!

- #19

- 841

- 1

3x = 4x

5x+4y = 4y

3x+6y+z = 4z

So the first gives x = 1.3, the second gives y = 0, and the third gives z = 1.3 again.

The only solution to the equation 3x = 4x is x=0. That makes the second equation 4y = 4y or y=1. The third equation becomes 6+z=4z, or 3z=6, or z=2.
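
Since that matrix is lower triangular, the whole spectrum can also be cross-checked numerically; a sketch in NumPy:

```python
import numpy as np

A = np.array([[3.0, 0.0, 0.0],
              [5.0, 4.0, 0.0],
              [3.0, 6.0, 1.0]])

# A is lower triangular, so its eigenvalues are the diagonal entries 3, 4, 1.
eigvals, eigvecs = np.linalg.eig(A)
assert sorted(np.round(eigvals.real)) == [1.0, 3.0, 4.0]

# And (0, 1, 2) is indeed an eigenvector for eigenvalue 4:
v = np.array([0.0, 1.0, 2.0])
assert np.allclose(A @ v, 4.0 * v)
```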

- #20

- 5

- 0

Thank you for the help. I will try a few more to make sure I've got it.

- #21

- 219

- 0

-3 7 -5

2 4 3

1 2 2

I just have no idea how to get the eigenvalues the normal way, by setting the determinant of A − λI equal to zero. Any help would be much appreciated!

My next problem is that when I'm looking at the eigenspace and then doing x+y+z=0 etc. to work out what x, y, and z need to equal to form an eigenvector, what happens if all 3 equations are the same? Here's what I have:

6x-2y-4z

3x-y-2z

6x-2y-4z

all equal to zero. Do I take x,y and z as being zero or what? So confused!

- #22

selfAdjoint

Staff Emeritus

Gold Member

Dearly Missed

- 6,852

- 10

-3 7 -5

2 4 3

1 2 2

Subtract λ from each of the numbers on the main diagonal. That is the same as subtracting λI from your matrix.

Now you have

(-3-λ) 7 -5

2 (4-λ) 3

1 2 (2-λ)

Now go through the steps for computing the determinant of this matrix, keeping the lambda factors in parentheses. You will get an expression with the lambda factors in it. Multiply the factors out and collect terms in powers of lambda. You now have a polynomial in lambda. Set it equal to zero and solve the equation.
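
As a sanity check on that procedure (assuming NumPy), you can compare the hand-expanded polynomial against the determinant evaluated numerically; for the matrix above, expanding by hand works out to −λ³ + 3λ² + 25λ − 13:

```python
import numpy as np

A = np.array([[-3.0, 7.0, -5.0],
              [ 2.0, 4.0,  3.0],
              [ 1.0, 2.0,  2.0]])

# p(lam) = det(A - lam*I), evaluated numerically for any lam.
def p(lam):
    return np.linalg.det(A - lam * np.eye(3))

# Expanding the determinant by hand as described above gives
# p(lam) = -lam**3 + 3*lam**2 + 25*lam - 13; spot-check at a few points:
for lam in (-2.0, 0.0, 1.0, 3.5):
    assert abs(p(lam) - (-lam**3 + 3.0*lam**2 + 25.0*lam - 13.0)) < 1e-8
```

Setting that cubic to zero gives the eigenvalues (they are not nice whole numbers for this matrix, which may be why they are hard to spot by hand).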

- #23

- 219

- 0

- #24

Njorl

Science Advisor

- 267

- 14

If I ever had to teach it, I would dress in a different outlandish costume each day - lederhosen one day, matador outfit the next, then perhaps nothing but a loin cloth. That might keep the students awake.

Oddly enough, it is very important to know if you go into physics or any field with extensive mathematical modelling.

Njorl
