How Do Eigenvalues and Eigenvectors Connect to Fourier Transforms?

Summary
Eigenvalues and eigenvectors are crucial in linear algebra, representing the scaling factors and directions of transformations applied to vectors. The discussion highlights that diagonalizing a matrix reveals its eigenvalues and eigenvectors, which have significant applications in fields like acoustics and quantum mechanics. The relationship between these concepts and Fourier transforms is noted, as both involve transforming functions into different bases for analysis. The importance of understanding these mathematical foundations is emphasized, particularly for applications in physics and engineering. Resources for learning about the derivation of Fourier, Laplace, and Z transforms are also sought.
MrAlbot
Hello guys, is there any way someone can explain to me briefly what eigenvalues and eigenvectors are? I don't really recall this topic from linear algebra, and I'm not getting any intuition about where the Fourier transform comes from.

my teacher wrote:
$$A\bar{v} = \lambda\bar{v}$$

then he said that for a vector ##\bar{x}##

$$\bar{x} = \sum_{i=1}^{n} x_i \bar{e}_i$$

and he calls this ##\bar{e}_i## the initial orthonormal basis

then he says that this is equal to

$$\bar{x} = \sum_{i=1}^{n} \hat{x}_i \bar{v}_i$$

where ##\bar{v}_i## is the basis of eigenvectors of ##A##.

Then he says that ##\bar{y} = A\bar{x}##, so

$$\bar{y} = \sum_{i=1}^{n} y_i \bar{e}_i = \sum_{i=1}^{n} \hat{y}_i \bar{v}_i = A\sum_{i=1}^{n} \hat{x}_i \bar{v}_i = \sum_{i=1}^{n} \hat{x}_i A\bar{v}_i = \sum_{i=1}^{n} \hat{x}_i \lambda_i \bar{v}_i,$$

using ##A\bar{v}_i = \lambda_i\bar{v}_i## in the last step.

So we get that ##\hat{y}_i = \lambda_i \hat{x}_i##.
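The whole derivation can be checked numerically. This is a minimal NumPy sketch, where the symmetric matrix ##A## and the vector ##x## are arbitrary examples (not from the lecture): expand ##x## in the eigenbasis, scale each coordinate by its eigenvalue, and map back.

```python
import numpy as np

# Arbitrary symmetric 2x2 matrix, chosen only for illustration
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Eigendecomposition: columns of V are eigenvectors, lam the eigenvalues
lam, V = np.linalg.eigh(A)

x = np.array([3.0, -1.0])

# Coordinates of x in the eigenbasis (V has orthonormal columns)
x_hat = V.T @ x

# Applying A in the eigenbasis is just componentwise scaling: y_hat_i = lam_i * x_hat_i
y_hat = lam * x_hat

# Map back to the standard basis and compare with the direct product A x
y = V @ y_hat
print(np.allclose(y, A @ x))  # True
```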

I would like to know the intuition behind this and how it relates to the Fourier series / Fourier transform.
I'd really appreciate it if you didn't go too deep into the mathematics, since my linear algebra foundations are very weak and I don't have time to relearn it properly right now.


Hope someone can help!

Thanks in advance!

Pedro
 
In linear algebra you will have "diagonalized the matrix" towards the end of the term; this process finds the eigenvalues (the terms on the diagonal) and the eigenvectors (the new set of basis vectors for the system).

Thus if you can diagonalize the matrix, a complete set of eigenvectors will exist; they have very nice analytical properties. They correspond to the physical "modes of the system" - if you bang something in the same direction as one of its eigenvectors, then it will only respond in that direction; if you hit it elsewhere, you get multiple responses. That is the significance of the eigenvector equation ... used heavily in acoustics and quantum mechanics, among others.
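A small NumPy sketch of this "modes of the system" picture, using a made-up symmetric matrix as a stand-in for a physical system: pushing along an eigenvector gives a response in that same direction only, while pushing in a generic direction excites several modes at once.

```python
import numpy as np

# Toy symmetric "system" matrix (an assumption, purely for illustration)
A = np.array([[3.0, 1.0],
              [1.0, 3.0]])

lam, V = np.linalg.eigh(A)
v0 = V[:, 0]                 # first mode

# Push along an eigenvector: the response stays in that direction
print(np.allclose(A @ v0, lam[0] * v0))  # True

# Push in a generic direction: both modes get excited
u = np.array([1.0, 0.0])
coeffs = V.T @ u             # how much of each mode is excited
print(coeffs)                # both components are nonzero
```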
 
Not all matrices have a set of eigenvectors that spans the whole vector space. As an example, consider the rotation matrix in ##\mathbb{R}^2##:

$$\begin{pmatrix}
\cos\theta & -\sin\theta \\
\sin\theta & \cos\theta
\end{pmatrix}$$

Unless ##\theta## is a multiple of ##\pi##, this matrix has no real eigenvectors at all!
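You can see this numerically: over ##\mathbb{C}## the eigenvalues turn out to be ##e^{\pm i\theta}##, which are not real unless ##\theta## is a multiple of ##\pi##. A quick NumPy check, with an arbitrary angle:

```python
import numpy as np

theta = 0.3  # any angle that is not a multiple of pi
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Over the reals there are no eigenvectors, but over C the eigenvalues
# are e^{+i theta} and e^{-i theta}: real part cos(theta), modulus 1
lam = np.linalg.eigvals(R)
print(np.allclose(lam.real, np.cos(theta)))  # True
print(np.allclose(np.abs(lam), 1.0))         # True
```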

Usually in applications in physics and engineering, the matrices are Hermitian, which guarantees a complete orthonormal set of eigenvectors with real eigenvalues.
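A quick sketch of that guarantee with NumPy, using a randomly generated Hermitian matrix purely for illustration: the eigenvectors come out orthonormal and they reconstruct the matrix exactly.

```python
import numpy as np

# Build a random Hermitian matrix: H = (M + M^dagger) / 2
rng = np.random.default_rng(0)
M = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
H = (M + M.conj().T) / 2

# eigh exploits hermiticity; lam comes back real and sorted
lam, V = np.linalg.eigh(H)

# The eigenvectors form an orthonormal (unitary) basis...
print(np.allclose(V.conj().T @ V, np.eye(4)))          # True

# ...and diagonalization reconstructs H: H = V diag(lam) V^dagger
print(np.allclose(V @ np.diag(lam) @ V.conj().T, H))   # True
```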
 
Mr. Albot:
Maybe you can look at hilbert2's example to illustrate the concept: if 1 were an eigenvalue, then some vector would be sent to itself (more precisely, back to the same place ##(x, y)## where it started) after the rotation. Clearly, as hilbert2 says, 1 can only be an eigenvalue if you rotate by an integer multiple of ##\pi##, and the eigenvectors would be all the points that are fixed by the rotation. Notice that if ##\lambda = 1## is an eigenvalue, that means ##Tv = v##, so ##v## is fixed by the transformation.
 
Thanks a lot guys! I just started studying linear algebra from the beginning because I wasn't understanding anything you were saying, but now I can see how useful your comments were! Algebra is beautiful... Thanks a lot again!
 
A correction to my post #4: that should be an integer multiple of ##2\pi## , not an integer multiple of ##\pi##.
 
Exactly! That makes a lot more sense now, but I got the point the first time. Do you know the best place to learn the derivation of the Fourier transform? Right now I'm learning from Khan Academy since I'm a little short on time, but it's been a pleasant trip through linear algebra. How exactly do I map from ##\mathbb{R}^n## to the complex plane?
Best regards

edit: what I really want to know is the derivation of the Laplace transform and the Z transform, since the Fourier transform comes from those.
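As an aside on the eigenvector–Fourier connection: in the discrete setting, the complex exponentials (the DFT vectors) are eigenvectors of every shift-invariant (circulant) matrix, and the corresponding eigenvalues are the DFT of the matrix's first row. That is exactly the "apply the operator by multiplying each coordinate by ##\lambda_i##" picture from the first post. A NumPy sketch, where the circulant matrix is an arbitrary example:

```python
import numpy as np

n = 8
# First row of an arbitrary circulant (shift-invariant) matrix
c = np.array([4.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0])
C = np.array([[c[(j - i) % n] for j in range(n)] for i in range(n)])

# Each DFT vector w_k[m] = exp(-2 pi i k m / n) is an eigenvector of C,
# with eigenvalue given by the k-th entry of the DFT of the first row
for k in range(n):
    w = np.exp(-2j * np.pi * k * np.arange(n) / n)
    lam_k = np.fft.fft(c)[k]
    assert np.allclose(C @ w, lam_k * w)
print("every DFT vector is an eigenvector of C")
```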
 
