
Eigenvalues and eigenvectors

  1. Oct 30, 2013 #1
    Hello guys, is there any way someone can explain to me briefly what eigenvalues and eigenvectors are? I don't really recall this topic from linear algebra, and I'm not getting any intuition for where the Fourier transform comes from.

    my teacher wrote:
    [itex]A\overline{v} = \lambda\overline{v}[/itex]

    then he said that for a vector [itex]\overline{x}[/itex]

    [itex]\overline{x} = \sum_{i=1}^{n} x_i \overline{e}_i[/itex]

    and he calls this [itex]\overline{e}_i[/itex] the initial orthonormal basis

    then he says that this is equal to

    [itex]\overline{x} = \sum_{i=1}^{n} \widehat{x}_i \overline{v}_i[/itex]

    where [itex]\overline{v}_i[/itex] is the basis of eigenvectors of A


    then he says that for [itex]\overline{y} = A\overline{x}[/itex]

    [itex]\overline{y} = \sum_{i=1}^{n} y_i \overline{e}_i = \sum_{i=1}^{n} \widehat{y}_i \overline{v}_i = A\sum_{i=1}^{n} \widehat{x}_i \overline{v}_i = \sum_{i=1}^{n} \widehat{x}_i A\overline{v}_i = \sum_{i=1}^{n} \widehat{x}_i \lambda_i \overline{v}_i[/itex], using [itex]A\overline{v}_i = \lambda_i \overline{v}_i[/itex] in the last step

    So we get that [itex]\widehat{x}_i \lambda_i = \widehat{y}_i[/itex]
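    The teacher's derivation can be checked numerically. Below is a small sketch (using numpy; the matrix and vector are illustrative values, not from the thread) that computes the coordinates of x and y in the eigenvector basis and confirms that each coordinate of y is the corresponding coordinate of x scaled by the eigenvalue:

```python
import numpy as np

# An arbitrary symmetric matrix, so its eigenvectors form an
# orthonormal basis (illustrative values).
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# Columns of V are the eigenvectors v_i; eigvals holds the lambda_i
eigvals, V = np.linalg.eigh(A)

x = np.array([1.0, 2.0])
y = A @ x

# Coordinates in the eigenvector basis: x_hat_i = v_i . x
x_hat = V.T @ x
y_hat = V.T @ y

# The derivation says y_hat_i = lambda_i * x_hat_i for each i
print(np.allclose(y_hat, eigvals * x_hat))  # True
```

So in the eigenvector basis, applying A is just a componentwise multiplication by the eigenvalues.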

    I would like to know the intuition behind this and how it relates to the Fourier series / Fourier transform.
    I'd really appreciate it if you didn't go into deep mathematics, since I have very weak linear algebra foundations and would have to spend time relearning it, which unfortunately I can't do right now.


    Hope someone can help!

    Thanks in advance!

    Pedro
     
  3. Oct 30, 2013 #2

    UltrafastPED

    User Avatar
    Science Advisor
    Gold Member

    In linear algebra you will have "diagonalized the matrix" towards the end of the term; this process finds the eigenvalues (the terms on the diagonal) and the eigenvectors (the new set of basis vectors for the system).

    Thus if you can diagonalize the matrix, a complete set of eigenvectors will exist; they have very nice analytical properties. They correspond to the physical "modes of the system" - if you bang something in the same direction as one of its eigenvectors, then it will only respond in that direction; if you hit it elsewhere, you get multiple responses. That is the significance of the eigenvector equation ... used heavily in acoustics and quantum mechanics, among others.
     
  4. Oct 30, 2013 #3

    hilbert2

    User Avatar
    Science Advisor
    Gold Member

    Not all matrices have a set of eigenvectors that spans the whole vector space. As an example, consider the rotation matrix in ##\mathbb{R}^2##:

    \begin{pmatrix}
    \cos\theta & -\sin\theta \\
    \sin\theta & \cos\theta
    \end{pmatrix}

    Unless ##\theta## is a multiple of ##\pi##, this matrix doesn't have any real eigenvectors at all!

    Usually in applications in physics and engineering, the matrices are Hermitian, which guarantees a complete set of eigenvectors.
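    A numerical check of the rotation example (numpy's eig works over the complex numbers, so it still returns eigenvalues; the point is that they are not real for a generic angle):

```python
import numpy as np

theta = np.pi / 4  # not a multiple of pi
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

eigvals = np.linalg.eigvals(R)
# The eigenvalues form the complex pair cos(theta) +/- i*sin(theta),
# so there is no real eigenvector for this theta:
print(np.all(np.abs(eigvals.imag) > 0))  # True

# At theta = pi the matrix is -I, the eigenvalues are real (-1, -1),
# and every vector is an eigenvector:
R_pi = np.array([[np.cos(np.pi), -np.sin(np.pi)],
                 [np.sin(np.pi),  np.cos(np.pi)]])
print(np.allclose(np.linalg.eigvals(R_pi).real, [-1.0, -1.0]))  # True
```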
     
  5. Oct 30, 2013 #4

    WWGD

    User Avatar
    Science Advisor
    Gold Member

    Mr. Albot:
    Maybe you can look at hilbert2's example to illustrate the concept: if 1 were an eigenvalue, then a vector would be sent to itself (or, more precisely, to the same place (x, y) where it started) after the rotation. Clearly, as hilbert2 says, 1 can only be an eigenvalue if you rotate by an integer multiple of ##\pi##, and the eigenvectors would be all the points that are fixed by the rotation. Notice that if ##\lambda = 1## is an eigenvalue, that means ##Tv = v##, so that v is fixed by the transformation.
     
  6. Nov 1, 2013 #5
    Thanks a lot, guys! I just started studying linear algebra from the beginning because I wasn't understanding anything you were saying, but now I can see how useful your comments were! Algebra is beautiful... Thanks a lot again!
     
  7. Nov 1, 2013 #6

    WWGD

    User Avatar
    Science Advisor
    Gold Member

    A correction to my post #4: that should be an integer multiple of ##2\pi## , not an integer multiple of ##\pi##.
     
  8. Nov 2, 2013 #7
    Exactly! That makes a lot more sense now, but I got the point the first time. Do you know the best place to learn the derivation of the Fourier transform? Right now I'm learning from Khan Academy since I'm a little short on time, but it's been a pleasant trip through linear algebra. How exactly do I map from ##\mathbb{R}^n## to the complex plane?
    Best regards

    edit: what I really want to know is the derivation of the Laplace transform and the Z transform, since the Fourier transform comes from those.
     
    Last edited: Nov 2, 2013