brianparks said:
Thanks Tom for the excellent reply. Given your description, I can see how these operators would be useful in the analysis of electromagnetic phenomena.
You're welcome.
You seem to be pretty good at offering simple introductory explanations of concepts. Would you mind giving a similar explanation of a Fourier series? Its purpose, utility, and so on? Again, thanks for your help
OK, let's back it up to Day One of Mechanics I.
Vectors: Components and Basis Vectors
You learned that any vector v can be decomposed as follows:
v = v_x e_x + v_y e_y + v_z e_z
The v_i (i = x, y, z) are the components.
The e_i (i = x, y, z) are the orthonormal basis vectors.
Vectors: Spaces and Inner Products
The vector space spanned by the basis vectors is called R^3. That is, any vector that exists in the vector space can be constructed as a linear combination of the spanning basis vectors.
We can also define an inner product (aka dot product) on R^3 as follows:
e_i . e_j = δ_ij.
That is, the inner product of two basis vectors is 1 if i = j, and zero otherwise. This encodes the orthonormality of the basis vectors.
Vectors: Components Revisited
So, having defined the inner product, we can express the components of a vector as follows:
v_i = v . e_i (i = x, y, z).
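As a quick numerical sanity check of both facts (the orthonormality relation and the component formula), here is a sketch in Python, assuming NumPy is available; the particular vector v is just an arbitrary example:

```python
import numpy as np

# The standard orthonormal basis of R^3 (rows of the identity matrix)
ex, ey, ez = np.eye(3)
basis = np.array([ex, ey, ez])

# Orthonormality: e_i . e_j = delta_ij, i.e. the Gram matrix is the identity
assert np.allclose(basis @ basis.T, np.eye(3))

# Components are recovered by the inner product: v_i = v . e_i
v = np.array([2.0, -1.0, 3.0])
components = np.array([v @ e for e in basis])
print(components)  # recovers [2, -1, 3]
```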
Now, let's imagine that we have an enlarged vector space of n dimensions, R^n. All of the above still applies, but we have more basis vectors. We can even let n go to infinity and have a countably infinite dimensional vector space, and it would all still apply.
Now we get to Fourier series.
Fourier Series: Components and Basis Functions
An odd function f(x) can be decomposed on an interval (0, L) as follows (I'm sticking to odd functions for simplicity; I'll get to other functions later):
f(x) = Σ_n a_n sin(nπx/L)
where the sum is taken from n = 1 to infinity.
The a_n are the Fourier coefficients, and we will come to see that they can be regarded as the "components" of the function in much the same way as the v_i are the components of the vector v above.
The sin(nπx/L) are the basis functions, and we will come to see that they can be regarded as the "basis vectors" in much the same way as the e_i are the basis vectors of R^n.
Functions: Spaces and Inner Products
I've already started to draw the parallel between vectors and functions. So, the natural question is, "What is the 'space' of functions"?
Answer: Function space, which I'll call F.
F can be thought of as an infinite dimensional vector space (and the results from above apply for infinite dimensional vector spaces, remember?) which is spanned by the basis functions. That is, any function that exists in the function space can be constructed by a linear combination of the basis functions.
Let the basis functions be represented by their index n as follows:
f_n(x) = sin(nπx/L).
We can define an inner product on this space as follows:
<f_m(x), f_n(x)> = ∫ f_m(x) f_n(x) dx = (L/2) δ_mn (you can verify the last step yourself)
where the integration is taken from 0 to L. This encodes the orthogonality of the basis functions, just like the vector inner product in R^n encodes the orthonormality of the basis vectors. (Strictly, these basis functions are orthogonal but not normalized: the inner product of f_n with itself is L/2 rather than 1, which is where the factor of 2/L in the coefficient formula comes from.)
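If you'd rather not do that integral by hand just yet, here is a quick numerical check (a sketch in Python with NumPy; the values of L and the grid size are arbitrary choices):

```python
import numpy as np

L = 2.0                          # interval length (arbitrary choice)
x = np.linspace(0.0, L, 20001)
dx = x[1] - x[0]

def f(n, x):
    """Basis function f_n(x) = sin(n pi x / L)."""
    return np.sin(n * np.pi * x / L)

# <f_m, f_n> = integral from 0 to L of f_m f_n dx, done as a Riemann sum
for m in range(1, 4):
    for n in range(1, 4):
        inner = np.sum(f(m, x) * f(n, x)) * dx
        expected = L / 2 if m == n else 0.0
        print(m, n, round(inner, 4), expected)
```

The diagonal entries come out to L/2 and the off-diagonal entries to zero, matching (L/2) δ_mn.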
Functions: Components Revisited
So, having defined the inner product, we can express the components of a function as follows:
a_j = (2/L) <f_j(x), f(x)>
(I'll leave that as an exercise).
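As a concrete example, here is that coefficient formula applied numerically to the odd function f(x) = x (a sketch in Python; for this particular f, evaluating the integral by hand gives a_n = 2L(-1)^(n+1)/(nπ), which the numerical sum should reproduce):

```python
import numpy as np

L = 1.0
x = np.linspace(0.0, L, 20001)
dx = x[1] - x[0]

f = x                                       # f(x) = x on (0, L)

for n in range(1, 5):
    fn = np.sin(n * np.pi * x / L)          # basis function f_n
    a_n = (2.0 / L) * np.sum(fn * f) * dx   # a_n = (2/L) <f_n, f>
    exact = 2.0 * L * (-1.0) ** (n + 1) / (n * np.pi)
    print(n, round(a_n, 4), round(exact, 4))
```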
Now, I said that the series I wrote down was for odd functions. It turns out that even functions must be represented in terms of cosine basis functions (because they are even). Functions that are neither even nor odd can be represented by series involving both sine and cosine terms.
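To make that last point concrete, here is a sketch of the full series for a function that is neither even nor odd, on the symmetric interval (-L, L). The coefficient formulas used here (a_n = (1/L)∫ f cos(nπx/L) dx, b_n = (1/L)∫ f sin(nπx/L) dx, plus a constant term a_0/2) are the standard ones for that interval, not derived above, and the example function is an arbitrary choice:

```python
import numpy as np

L = 1.0
x = np.linspace(-L, L, 40001)
dx = x[1] - x[0]

f = x + x**2            # neither even nor odd

# Partial sum of the full Fourier series up to N harmonics
N = 100
s = np.full_like(x, 0.5 * np.sum(f) * dx / L)   # a_0/2 constant term
for n in range(1, N + 1):
    cn = np.cos(n * np.pi * x / L)
    sn = np.sin(n * np.pi * x / L)
    a_n = np.sum(f * cn) * dx / L
    b_n = np.sum(f * sn) * dx / L
    s += a_n * cn + b_n * sn

# The partial sum tracks f away from the interval endpoints
# (near the endpoints the periodic extension jumps, so convergence is slower)
mid = np.abs(x) < 0.8 * L
print(np.max(np.abs(s[mid] - f[mid])))
```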
I hope that I have made Fourier series more concrete for you by connecting it to something more familiar.
I am going to stop posting for a while, because my fingers are killing me!
edit: fixed a couple of brackets