Double Concept Check: Linear Algebra with Diagonalization and Eigenstuff

flyingpig

Homework Statement



I've been reading my linear algebra book for an hour now, and here is what I've gathered.

Eigenvectors are vectors that satisfy Ax = λx, but I've noticed that the eigenvectors aren't the important thing here; it is the eigenvalues that really have the "big" meaning.

Ax = λx

That is, there exists a matrix that acts like a scalar. Is that what it means?

However if I do

(A - λI)x = 0

And x is a solution in the nullspace, does that mean the solutions are also the eigenvectors and they form my eigenspace? What if there is more than one eigenvalue (and they aren't repeated)?

Also, what is up with the concept of diagonalization? The book I use just threw the definition at me and the mechanics with it and implicitly said "we found it, now you deal with it and understand it." How did they even come up with something like this? I thought we already got the LU factorization, why do we need another one of these guys?
 
flyingpig said:

Homework Statement



I've been reading my linear algebra book for an hour now, and here is what I've gathered.

Eigenvectors are vectors that satisfy Ax = λx, but I've noticed that the eigenvectors aren't the important thing here; it is the eigenvalues that really have the "big" meaning.

Ax = λx

That is, there exists a matrix that acts like a scalar. Is that what it means?

Not quite: A.x always returns a vector. However, when a matrix is multiplied by one of its eigenvectors, the result is a scalar multiple of that same eigenvector.

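To make that concrete, here is a small sketch (my own example, not from the thread) using NumPy: `np.linalg.eig` finds the eigenvalues and eigenvectors, and we can check that A @ v really is a scalar multiple of v for each eigenvector.

```python
import numpy as np

# A small example matrix (an assumption for illustration)
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose COLUMNS
# are the corresponding eigenvectors
eigenvalues, eigenvectors = np.linalg.eig(A)

for i in range(len(eigenvalues)):
    lam = eigenvalues[i]
    v = eigenvectors[:, i]
    # On its eigenvector, A acts like the scalar lam: A @ v == lam * v
    assert np.allclose(A @ v, lam * v)
```

For a generic vector that is not an eigenvector, A @ x points in a different direction than x, which is why the "acts like a scalar" description only holds on eigenvectors.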
Eigenvectors/eigenvalues have all sorts of applications across physics and maths in linear systems.
However if I do

(A - λI)x = 0

And x is the solution of the nullspace, does that mean the solution is also the eigenvectors and they form my eigenspace? What if there are more than one eigenvalues (and they aren't multiplicity)
If a nonzero x is in the nullspace of A, then it is an eigenvector with eigenvalue zero.

If a nonzero x is in the nullspace of (A - λI), i.e. (A - λI)x = 0, then it is an eigenvector with eigenvalue λ.
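Here is a hedged sketch of that recipe (the matrix and λ are my own assumptions, not from the thread): pick a known eigenvalue λ, form A - λI, and extract its null space from the SVD; any nonzero vector in that null space is an eigenvector for λ.

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [0.0, 2.0]])
lam = 2.0  # a known eigenvalue of this triangular matrix (a diagonal entry)

M = A - lam * np.eye(2)

# Null space of M: the right-singular vectors whose singular values are ~0
_, s, Vt = np.linalg.svd(M)
null_vectors = Vt[s < 1e-10]  # each row spans (part of) the eigenspace for lam

x = null_vectors[0]
# x solves (A - lam*I)x = 0, so it is an eigenvector: A @ x == lam * x
assert np.allclose(A @ x, lam * x)
```

If λ has geometric multiplicity greater than one, `null_vectors` has more than one row, and those rows form a basis of the eigenspace for λ.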

Also, what is up with the concept of diagonalization? The book I use just threw the definition at me and the mechanics with it and implicitly said "we found it, now you deal with it and understand it." How did they even come up with something like this? I thought we already got the LU factorization, why do we need another one of these guys?

Diagonalisation is essentially a change of basis to the basis of eigenvectors of the matrix. It is useful for solving a number of problems because of the simple form the matrix takes, and its simple action, in that basis.
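A minimal sketch of what that means in practice (my example matrix, assumed for illustration): writing A = P D P⁻¹, where the columns of P are eigenvectors and D is diagonal with the eigenvalues, turns A into pure scaling in the eigenvector basis. One payoff is cheap matrix powers, since Aᵏ = P Dᵏ P⁻¹ and powering a diagonal matrix is just powering its entries.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Diagonalise: columns of P are eigenvectors, D holds the eigenvalues
eigenvalues, P = np.linalg.eig(A)
D = np.diag(eigenvalues)

# A is recovered as P D P^{-1}
assert np.allclose(P @ D @ np.linalg.inv(P), A)

# A^5 via the diagonal form: power the eigenvalues, change basis back
A5_diag = P @ np.diag(eigenvalues**5) @ np.linalg.inv(P)
assert np.allclose(A5_diag, np.linalg.matrix_power(A, 5))
```

This is also why diagonalization is a different tool from LU: LU is built for solving linear systems efficiently, while diagonalization exposes the matrix's action (scaling along eigenvector directions), which is what you want for powers, matrix exponentials, and linear differential or difference equations.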
 
lanedance said:
Not quite: A.x always returns a vector. However, when a matrix is multiplied by one of its eigenvectors, the result is a scalar multiple of that same eigenvector.

No, I meant "matrix" as in the matrix A.
Eigenvectors/eigenvalues have all sorts of applications across physics and maths in linear systems.

But what is their meaning in math, in the chapter I am doing right now? (LA beginner here =( )


Diagonalisation is essentially a change of basis to the basis of eigenvectors of the matrix. It is useful for solving a number of problems because of the simple form the matrix takes, and its simple action, in that basis.

What...?
 
OK, so take an n×n matrix A and an n×1 vector x.

y = A.x always returns an n×1 vector y.

When the multiplication returns a scalar multiple λ of x, we call x an eigenvector with corresponding eigenvalue λ:

y = A.x = λx
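A concrete numeric sketch of this (the matrix and vectors are my own, not from the thread): for a hand-picked eigenvector, A @ x is a scalar multiple of x, while for a generic vector it is not.

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [0.0, 2.0]])

# x = (1, 0) is an eigenvector: A @ x = (3, 0) = 3 * x, so lambda = 3
x = np.array([1.0, 0.0])
y = A @ x
assert np.allclose(y, 3.0 * x)

# A generic vector z is NOT an eigenvector: A @ z is not parallel to z
z = np.array([1.0, 1.0])
w = A @ z  # gives (4, 2), which is not a scalar multiple of (1, 1)
assert not np.allclose(w / w[0], z / z[0])
```

So "A acts like a scalar" is a statement about special directions: along an eigenvector, multiplying by A is the same as multiplying by the number λ.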
 