# Eigenfunction expansions

1. Mar 29, 2013

### I<3NickTesla

In the picture attached I understand everything up to 1.12. I wrote "think of it like a matrix" at the time and that made sense, but now I don't really get it. There's obviously an analogy between decomposing a vector in a matrix's eigenvector basis and decomposing a function in an eigenfunction basis, but I'm not really seeing it. Thanks.

#### Attached Files:

• ###### whatwhatwhat.jpg
2. Mar 29, 2013

### csopi

Of course there is an analogy: a matrix can be thought of as a linear map from one finite-dimensional vector space to another. A linear operator is "the same" in the sense that it is a map from one (infinite-dimensional) vector space to another.

But that is not what you need here. What you use in 1.12 is the so-called completeness property of the eigenfunctions:

$$\sum_n f_n(x) f_n^*(x')=\delta(x-x')$$
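The finite-dimensional analog of this completeness relation is easy to check numerically: for a Hermitian matrix, the orthonormal eigenvectors $v_n$ satisfy $\sum_n v_n v_n^\dagger = I$, the matrix counterpart of the delta function. A minimal sketch (the matrix here is arbitrary random test data):

```python
import numpy as np

# Discrete analog of completeness: for a Hermitian A, the orthonormal
# eigenvectors v_n satisfy sum_n v_n v_n^dagger = I, the matrix
# counterpart of sum_n f_n(x) f_n*(x') = delta(x - x').
rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = M + M.conj().T                      # make A Hermitian

eigvals, V = np.linalg.eigh(A)          # columns of V: orthonormal eigenvectors

# Sum of outer products v_n v_n^dagger over all eigenvectors
completeness = sum(np.outer(V[:, n], V[:, n].conj()) for n in range(4))

print(np.allclose(completeness, np.eye(4)))   # True
```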

3. Mar 29, 2013

### jasonRF

Let $A$ be an $N\times N$ Hermitian matrix, that is, $A=A^\dagger$. Here $\dagger$ means conjugate transpose if $A$ is complex, and simply transpose if $A$ is real. By the spectral theorem of linear algebra, $A$ has a complete set of orthonormal eigenvectors, each of which satisfies $A v_n = \lambda_n v_n$. Orthonormal means $(v_m, v_n) = \delta_{m,n}$, where $(x, y)$ is the inner product of two vectors $x$ and $y$, and $\delta_{m,n}$ is one if $m=n$ and zero otherwise. Since there are $N$ orthonormal eigenvectors, they must span our $N$-dimensional space, so any vector can be represented as a sum of the eigenvectors.

Now, consider
$$A x = b$$
where we know $b$ but not $x$. The idea is to expand both $b$ and $x$ in the eigenvectors,
$$\begin{eqnarray} x & = & \sum_{n=1}^N x_n v_n \\ b & = & \sum_{n=1}^N b_n v_n \end{eqnarray}$$
From orthonormality, we can find the $b_n$ (these are just numbers)
$$b_n = (v_n,b).$$
We can then solve for the $x_n$ as follows. We start with $Ax=b$ and substitute the series:
$$A \sum_{n=1}^N x_n v_n = \sum_{n=1}^N (v_n,b) v_n.$$
The left hand side is then,
$$\begin{eqnarray} A \sum_{n=1}^N x_n v_n & = & \sum_{n=1}^N x_n A v_n \\ & = & \sum_{n=1}^N x_n \lambda_n v_n \end{eqnarray}$$
Ax=b can therefore be written,
$$\sum_{n=1}^N x_n \lambda_n v_n = \sum_{n=1}^N (v_n,b) v_n$$
Taking the inner product with $v_m$ yields
$$x_m \lambda_m = (v_m,b)$$
so
$$x_m = \frac{(v_m,b)}{\lambda_m}.$$
The solution is therefore
$$x = \sum_{n=1}^N \frac{v_n (v_n,b)}{\lambda_n}.$$
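The derivation above can be checked numerically. A minimal sketch (matrix and right-hand side are arbitrary test data): build a symmetric (hence Hermitian) $A$, form $x = \sum_n v_n (v_n,b)/\lambda_n$, and verify it solves $Ax=b$.

```python
import numpy as np

# Eigenvector-expansion solution of A x = b for a Hermitian (here real
# symmetric) A, checked against the original equation.
rng = np.random.default_rng(1)
M = rng.standard_normal((5, 5))
A = M + M.T                             # real symmetric => Hermitian
b = rng.standard_normal(5)

eigvals, V = np.linalg.eigh(A)          # A v_n = lambda_n v_n

# x = sum_n v_n (v_n, b) / lambda_n
x = sum(V[:, n] * (V[:, n] @ b) / eigvals[n] for n in range(5))

print(np.allclose(A @ x, b))            # True
```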
This is the eigenvector expansion solution of the non-homogeneous linear system. It is analogous to the eigenfunction expansion solution of the non-homogeneous Sturm-Liouville (SL) problem (note that the SL operator is Hermitian).

We almost have the Green's function analog, too. Note that the standard inner product is $(x,y)=x^\dagger y$, so we can write
$$\begin{eqnarray} x & = & \sum_{n=1}^N \frac{v_n v_n^\dagger b}{\lambda_n} \\ & = & \left( \sum_{n=1}^N \frac{v_n v_n^\dagger }{\lambda_n} \right) b. \end{eqnarray}$$
The final quantity in parentheses is a matrix that is analogous to the Green's function for the SL problem.
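In fact, that matrix $G = \sum_n v_n v_n^\dagger / \lambda_n$ is exactly $A^{-1}$, just as the Green's function is the inverse of the differential operator. A quick numerical check (again with an arbitrary random symmetric matrix):

```python
import numpy as np

# G = sum_n v_n v_n^dagger / lambda_n is the discrete Green's function:
# it equals A^{-1} for an invertible Hermitian A.
rng = np.random.default_rng(2)
M = rng.standard_normal((4, 4))
A = M + M.T                             # real symmetric => Hermitian

eigvals, V = np.linalg.eigh(A)
G = sum(np.outer(V[:, n], V[:, n]) / eigvals[n] for n in range(4))

print(np.allclose(G, np.linalg.inv(A)))  # True
```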

Hope this helps!

4. Apr 12, 2013

### I<3NickTesla

That helped, thanks