Eigenvectors and Eigenvalues

  1. Aug 21, 2010 #1
    Can someone explain these to me?
     
  2. Aug 21, 2010 #2
    For a given square matrix A, the eigenvectors [itex]x_i[/itex] and the eigenvalues [itex]b_i[/itex] satisfy the relationship:

    [tex]A x_i = b_i x_i[/tex]

    Eigenvectors and eigenvalues appear frequently in Physics, for example in the study of vibrations.
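    If it helps to see this concretely, here is a minimal NumPy sketch (the 2x2 matrix is just an illustrative choice of mine, nothing special) that checks the relation above numerically:

    Code:
    import numpy as np

    # Illustrative 2x2 matrix, chosen so the eigenpairs are easy to check by hand
    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])

    # np.linalg.eig returns the eigenvalues b_i and a matrix whose columns
    # are the corresponding eigenvectors x_i
    b, X = np.linalg.eig(A)

    for i in range(len(b)):
        x_i = X[:, i]
        # The defining relation A*x_i = b_i*x_i holds up to floating-point rounding
        print(b[i], np.allclose(A @ x_i, b[i] * x_i))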
     
  3. Aug 21, 2010 #3
    At a basic level, suppose we have a vector v and we multiply it by a square (n-by-n) matrix A. Once we compute A*v = w (where w is some new vector), we can look at the direction of w.

    If w is in the same direction as v, it must be some multiple of v. In other words, [tex]w = \lambda v[/tex], where lambda is just some scalar value.

    If this is the case, we can say that [tex]Av = \lambda v[/tex], where lambda is an eigenvalue and v is an eigenvector of the matrix A.
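    For a concrete example (with numbers of my own choosing): take [tex]A = \begin{bmatrix}2 & 1\\ 1 & 2\end{bmatrix}[/tex] and [tex]v = \begin{bmatrix}1\\ 1\end{bmatrix}[/tex]. Then [tex]Av = \begin{bmatrix}3\\ 3\end{bmatrix} = 3v[/tex], so w points in the same direction as v, and v is an eigenvector of A with eigenvalue [itex]\lambda = 3[/itex]. By contrast, [itex]\begin{bmatrix}1\\ 0\end{bmatrix}[/itex] is sent to [itex]\begin{bmatrix}2\\ 1\end{bmatrix}[/itex], which points in a different direction, so it is not an eigenvector of this A.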
     
  4. Aug 22, 2010 #4
    I have some questions about this. If w is in the same direction as v, this means that A (which is an operator represented by the matrix A) doesn't change the direction of v, i.e. A doesn't rotate the vector v. What do we call such operators (or matrices)? Are they only linear transformations?
     
    Last edited: Aug 22, 2010
  5. Aug 22, 2010 #5
    I guess some people will explain the details. I just want to explain the main reason why you use eigenvectors.

    If you find enough vectors [itex]x_i[/itex] which satisfy
    [tex]A x_i = q_i x_i[/tex]
    where [itex]q_i[/itex] is a scalar constant, then you can write any arbitrary vector y as a sum of these:
    [tex]y = t_1 x_1 + t_2 x_2 + t_3 x_3 + \cdots[/tex]
    Multiplying by the matrix A is now very easy:
    [tex]A y = t_1 q_1 x_1 + t_2 q_2 x_2 + t_3 q_3 x_3 + \cdots[/tex]
    You can see that eigenvectors simplify matrix multiplication a lot.
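    As a numerical sketch of this (with a made-up 2x2 matrix of my own), the coefficients t_i can be found by solving one linear system, after which multiplying by A reduces to scaling each coefficient by its q_i:

    Code:
    import numpy as np

    # Made-up example: q holds the eigenvalues q_i, the columns of X are the x_i
    A = np.array([[4.0, 1.0],
                  [2.0, 3.0]])
    q, X = np.linalg.eig(A)

    y = np.array([5.0, -1.0])
    t = np.linalg.solve(X, y)            # coefficients with y = t_1*x_1 + t_2*x_2

    direct  = A @ y                      # ordinary matrix-vector multiplication
    via_eig = X @ (q * t)                # t_1*q_1*x_1 + t_2*q_2*x_2
    print(np.allclose(direct, via_eig))  # both ways give the same vector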
     
  6. Aug 22, 2010 #6

    HallsofIvy

    Staff Emeritus
    Science Advisor

    What you are really asking for is a complete course in linear algebra!

    The basic "object of study" in Linear Algebra is the "vector space", an algebraic structure in which we have defined addition and multiplication by numbers.

    All functions, then are functions that "preserve" those operations: A(ru+ sv)= rA(u)+ sA(v), the "linear transformations".

    Further, the most fundamental property of "linear" itself is: we can "take apart" a linear problem, solve each part, then put the solutions to each part together to get a solution to the original problem. If I have to solve a problem of the form "Ax= B" then first solving "[itex]Ax= \lambda x[/itex]" shows me the best way to "take apart" the problem.
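    To make the "take apart" idea concrete, here is a small sketch (my own toy example, assuming A is diagonalizable with nonzero eigenvalues): expand B along the eigenvectors of A, solve each one-dimensional piece by dividing by its eigenvalue, and reassemble.

    Code:
    import numpy as np

    # Toy example only; assumes A is diagonalizable with nonzero eigenvalues
    A = np.array([[4.0, 1.0],
                  [2.0, 3.0]])
    B = np.array([3.0, 0.0])

    lam, X = np.linalg.eig(A)     # columns of X are eigenvectors of A
    c = np.linalg.solve(X, B)     # "take apart": B = c_1*x_1 + c_2*x_2
    x = X @ (c / lam)             # solve each 1-D piece, then put them back together

    print(np.allclose(A @ x, B))  # x solves Ax = B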
     
  7. Aug 22, 2010 #7
    I've typically heard them called linear transformations; however, I'm not really qualified to say if it's something else.

    Formally, a linear transformation is a map T: V -> W, where V and W are vector spaces, such that

    1.) T(v1 + v2) = T(v1) + T(v2), where v1, v2 are in V
    2.) T(av) = aT(v), where a is any scalar

    I don't have a proof that these hold for the eigenvalue/eigenvector problem, but I feel like they are satisfied.

    Also, depending on the operator, couldn't the eigenvalue problem be connected to certain mathematical groups? I'm mainly thinking about invariant groups.
     
  8. Aug 23, 2010 #8

    HallsofIvy

    Staff Emeritus
    Science Advisor

    A linear operator on a vector space is a function, F, such that F(au+ bv)= aF(u)+ bF(v) where u and v are vectors and a and b are scalars.

    A matrix is, of course, an array of numbers with specific addition and multiplication defined.

    We can think of m by n matrices (m rows, n columns) as linear transformations from the specific vector space [itex]R^n[/itex] to the vector space [itex]R^m[/itex], acting on column vectors by multiplication on the left, as in A*x above.

    More generally, if F is a linear transformation from a vector space of dimension n to a vector space of dimension m, then, given a specific ordered basis for each space, it can be represented as an m by n matrix.

    But linear transformations are not the same as matrices. For one thing, the correspondence depends on a specific basis (the matrices representing the same linear transformation in different bases are "similar" matrices, however). Further, there exist infinite-dimensional vector spaces in which linear transformations cannot be represented by (finite) matrices. One example is the vector space of all polynomials (with the usual definition of addition of polynomials and multiplication by numbers), with the linear operator being differentiation of the polynomial.
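    A small sketch of the "similar matrices" remark (the matrix and basis here are my own arbitrary picks): writing the same linear transformation in a different basis changes the matrix, but the new matrix is similar to the old one and has the same eigenvalues.

    Code:
    import numpy as np

    # A is the matrix of a linear transformation in the standard basis of R^2;
    # the columns of P form a different basis.
    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])
    P = np.array([[1.0, 1.0],
                  [0.0, 1.0]])

    # Matrix of the same transformation relative to the new basis ("similar" to A)
    B = np.linalg.inv(P) @ A @ P

    # Similar matrices have the same eigenvalues (here 3 and 1),
    # even though their entries look different
    print(np.linalg.eigvals(A), np.linalg.eigvals(B))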
     
  9. Aug 23, 2010 #9
    But differentiation of polynomials in R[t] can be represented by the (infinite) matrix
    [tex]\begin{bmatrix}
    0&1&0&0&\cdots\\
    0&0&2&0&\cdots\\
    0&0&0&3&\cdots\\
    \vdots&\vdots&\vdots&\vdots&\ddots\\
    \end{bmatrix}[/tex]
    relative to the standard basis [itex]\{1, t, t^2, \dots\}[/itex]. So can any other linear transformation between vector spaces with countable (Hamel) bases.
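    A finite truncation of this matrix can be checked directly. The sketch below (my own example) applies the top-left 4x4 block to the coefficient vector of a cubic polynomial:

    Code:
    import numpy as np

    # 4x4 truncation of the differentiation matrix above, acting on polynomials
    # of degree <= 3 written in the basis {1, t, t^2, t^3}
    D = np.array([[0, 1, 0, 0],
                  [0, 0, 2, 0],
                  [0, 0, 0, 3],
                  [0, 0, 0, 0]])

    # p(t) = 1 + 2t + 5t^2 - t^3, stored as its coefficient vector
    p = np.array([1, 2, 5, -1])

    # D @ p gives the coefficients of p'(t) = 2 + 10t - 3t^2
    print(D @ p)   # [ 2 10 -3  0]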
     
  10. Aug 24, 2010 #10
    Well, sure, but if the OP doesn't know what an eigenvector/eigenvalue is, it's best to start with the finite-dimensional case.
     