# Orthogonality of sinusoids

## Main Question or Discussion Point

Hello,
Let's consider the space $$L^2(\mathbb{R})$$ with its usual inner product, and the complex sinusoids on the interval $$(-\infty,+\infty)$$.
Is it correct to say that the complex sinusoids form an orthogonal basis for this space?

One would need to have:

$$\int_{-\infty}^{+\infty}e^{ipx}e^{-iqx}dx=0$$
for any $$p\neq q$$

but if $k=p-q$, that integral is:

$$\int_{-\infty}^{+\infty}e^{ikx}dx$$

and that integral is zero only if we take its Cauchy Principal Value.
Is this allowed or not?
What rigorous restriction should I include in order to say that those functions are orthogonal?
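As a quick numerical look at this point (a sketch; the choice $k=1$ and the values of $R$ are arbitrary, and not from the thread): the symmetrically truncated integral evaluates in closed form to $2\sin(kR)/k$, which keeps oscillating as $R$ grows instead of settling to zero, so the question of how to interpret the integral is a real one.

```python
import numpy as np

# Symmetric truncation of the integral of e^{ikx} over (-R, R).
# Analytically, for k != 0:
#   int_{-R}^{R} e^{ikx} dx = 2 sin(kR) / k,
# which oscillates in R and never converges to a limit.
# (k = 1 is an arbitrary illustrative choice.)

def truncated_integral(k, R):
    """Exact value of the symmetrically truncated integral."""
    return 2.0 * np.sin(k * R) / k

k = 1.0
values = [truncated_integral(k, R) for R in (10.0, 100.0, 1000.0)]
print(values)  # the values keep oscillating within [-2/k, 2/k]
```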


The latter integral is zero if $k$ is not zero (which means $p \neq q$, as you stated before). But plugging in $k=0$ gives you an infinite value (an integral of 1 over the whole line). This means that, with respect to $k$, the expression behaves like a delta function.
The equation:

$$\int^{\infty}_{-\infty}f_{p}(x)\bar{f}_{q}(x)dx=\delta (p-q)$$

For a set of functions with a continuous parameter ($p,q$ real), this is the condition corresponding to orthonormality, which for sets with a discrete parameter looks like:

$$\int^{\infty}_{-\infty}f_{m}(x)\bar{f}_{n}(x)dx=\delta_{m,n}$$
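A concrete discretely-indexed family in $L^2(\mathbb{R})$ that does satisfy the second relation is the Hermite functions $\psi_n(x)=H_n(x)e^{-x^2/2}/\sqrt{2^n n!\sqrt{\pi}}$ (my example, not from the thread). A quick numerical check of their orthonormality:

```python
import math
import numpy as np
from numpy.polynomial.hermite import hermval

# Hermite functions psi_n(x) = H_n(x) exp(-x^2/2) / sqrt(2^n n! sqrt(pi))
# form an orthonormal set in L^2(R) with a *discrete* index n, i.e.
#   int psi_m(x) psi_n(x) dx = delta_{mn}.

def psi(n, x):
    coeffs = np.zeros(n + 1)
    coeffs[n] = 1.0  # selects the physicists' Hermite polynomial H_n
    norm = math.sqrt(2.0**n * math.factorial(n) * math.sqrt(math.pi))
    return hermval(x, coeffs) * np.exp(-x**2 / 2.0) / norm

# The functions decay like exp(-x^2/2), so [-10, 10] captures them well.
x = np.linspace(-10.0, 10.0, 4001)
dx = x[1] - x[0]
gram = np.array([[np.sum(psi(m, x) * psi(n, x)) * dx for n in range(4)]
                 for m in range(4)])
print(np.round(gram, 6))  # approximately the 4x4 identity matrix
```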

Thanks a lot elibj123!

1) When $k\neq 0$ the integral $\int_{-\infty}^{+\infty}e^{ikx}dx$ in general does not exist. It is zero only if we consider its Cauchy Principal Value. Is this commonly accepted?

2) When $k=0$ the integral goes to infinity, so the "basis-functions" are not in $$L^2(\mathbb{R})$$. However Plancherel's Theorem states that the Fourier Transform is an isometry $$L^2(\mathbb{R})\rightarrow L^2(\mathbb{R})$$

How is it possible that the complex sinusoids $e^{ikx}$ are basis-functions of the space $L^2(\mathbb{R})$ but at the same time they are not in this space? In a vector-space the basis vectors should always be contained in the space.

Is it correct to say that the complex sinusoids form an orthogonal basis for this space?
It is not correct, at least not in the standard terminology of Hilbert spaces.

In a vector-space the basis vectors should always be contained in the space.
This remark is correct.

What rigorous restriction should I include in order to say that those functions are orthogonal?
I don't know. It seems that nobody is really interested in coming up with rigorous definitions that would allow these functions to be called orthogonal.

I have been left with the impression that the most important thing about this stuff is to understand how to prove the inverse Fourier transform under various kinds of assumptions. Once you know how to prove that the inverse Fourier transform works, you can leave this orthogonality stuff on a heuristic level.

The latter integral is zero if k is not zero (which means p is not q, and that you've stated before). But plugging k=0 gives you an infinite value (integration over 1). Which means, that wrt k, the expression behaves like a delta function.
These are erroneous claims. Firstly, the integral is not zero when $k\neq 0$; it is a divergent integral. Secondly, one cannot prove that something behaves like a delta function simply because it is infinite at the origin. The rate of divergence to infinity matters very much.

It is true that if $f$ is continuous at the origin, has the bounded variation property (meaning that it can be written as a sum of two monotone functions) in some neighborhood of the origin, and is integrable over $[a,b]$ with $a<0<b$, then

$$\lim_{R\to\infty} \int\limits_a^b f(k)\Big(\int\limits_{-R}^R e^{ikx} dx\Big) dk = 2\pi f(0)$$

This mathematically rigorous result is one way of giving the mysterious delta function equation some meaning.
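A numerical illustration of this limit (a sketch; $f(k)=e^{-k^2}$, $a=-1$, $b=1$, and the values of $R$ are my arbitrary choices): the inner integral evaluates analytically to $2\sin(kR)/k$, and the outer integral should then approach $2\pi f(0)=2\pi$.

```python
import numpy as np

# Check that  int_a^b f(k) * (int_{-R}^{R} e^{ikx} dx) dk  ->  2*pi*f(0)
# for f(k) = exp(-k^2), a = -1, b = 1 (arbitrary illustrative choices).
# The inner integral is 2*sin(k*R)/k for k != 0.

def outer_integral(R, a=-1.0, b=1.0, n=400_000):
    dk = (b - a) / n
    k = a + (np.arange(n) + 0.5) * dk  # midpoint grid, avoids k = 0
    inner = 2.0 * np.sin(k * R) / k    # inner integral, done analytically
    return np.sum(np.exp(-k**2) * inner) * dk

for R in (10.0, 50.0, 200.0):
    print(R, outer_integral(R))        # approaches 2*pi*f(0) = 2*pi
```

The convergence is slow (the error decays roughly like $1/R$), which is consistent with the kernel $\sin(kR)/(\pi k)$ only acting like a delta function inside an integral against a sufficiently nice $f$.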

1) When $k\neq 0$ the integral $\int_{-\infty}^{+\infty}e^{ikx}dx$ in general does not exist. It is zero only if we consider its Cauchy Principal Value. Is this commonly accepted?
What do you mean by the Cauchy principal value? Doesn't it normally mean that you somehow control the rate at which some parameters approach something, like setting $(R_1,R_2)=(-R,R)$ with $R\to\infty$ instead of simply $(R_1,R_2)\to (-\infty,\infty)$? I don't see how this would help here.

jbunniii
Homework Helper
Gold Member
2) When $k=0$ the integral goes to infinity, so the "basis-functions" are not in $$L^2(\mathbb{R})$$. However Plancherel's Theorem states that the Fourier Transform is an isometry $$L^2(\mathbb{R})\rightarrow L^2(\mathbb{R})$$
You are of course correct that the exponential functions are not in $L^2(\mathbb{R})$ (nor in $L^1(\mathbb{R})$ for that matter).

I would like to comment that the Fourier transform is indeed an isometry from $L^2 \rightarrow L^2$, but there are a lot of technicalities behind this statement.

You might correctly ask: if $f \in L^2(\mathbb{R})$, why does the defining integral for the Fourier transform of $f$, namely

$$\int_{-\infty}^\infty f(x) e^{-i\omega x}\,dx$$

necessarily exist, i.e. does the Fourier transform even exist for all $f \in L^2(\mathbb{R})$?

The answer is that it doesn't necessarily exist, at least not as defined by the above integral. In fact, since $|f(x) e^{-i\omega x}| = |f(x)|$, the integral converges absolutely if and only if $f \in L^1$. So to define the Fourier transform on $L^2$ one has to use a limiting argument.

Essentially, one proves that for any $f \in L^2$, there exists a sequence of "nice" (in this case Schwartz) functions $f_n \in L^2 \cap L^1$ that converge to $f$ in $L^2$. Since each $f_n$ is in $L^1$, the integral above defines the Fourier transform of $f_n$. The "niceness" of the functions ensures that $$\hat{f}_n \rightarrow \hat{f}$$, where $$\hat{f} \in L^2$$ is then defined as the Fourier transform of $f$.

There are a lot of details to check, most importantly that $$\hat{f}$$ is well-defined (independent of which sequence $f_n$ you choose) and that it agrees with the standard definition when $f$ is in $L^1 \cap L^2$.
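To see the issue concretely, $f(x)=\sin(x)/x$ is a standard example of a function in $L^2(\mathbb{R})$ but not in $L^1(\mathbb{R})$; its Fourier transform is $\pi$ on $(-1,1)$ and $0$ outside. A sketch (my arbitrary choices: $\omega=0.3$ and the truncation radii $R$) of how the symmetrically truncated integrals nevertheless converge:

```python
import numpy as np

# f(x) = sin(x)/x is in L^2(R) but NOT in L^1(R), so its Fourier
# integral does not converge absolutely.  Still, the truncated integrals
#   int_{-R}^{R} f(x) e^{-i w x} dx
# converge as R -> infinity; the limit is pi for |w| < 1.

def truncated_ft(w, R, n=800_000):
    dx = 2.0 * R / n
    x = -R + (np.arange(n) + 0.5) * dx  # midpoint grid, avoids x = 0
    f = np.sin(x) / x
    # the imaginary part vanishes by symmetry, so keep the cosine part
    return np.sum(f * np.cos(w * x)) * dx

for R in (50.0, 200.0, 800.0):
    print(R, truncated_ft(0.3, R))      # tends to pi for w = 0.3
```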

Good references for this include the following:

Jones, "Lebesgue Integration on Euclidean Space"
Rudin, "Real and Complex Analysis"
Stein and Shakarchi, "Real Analysis"

By the way, there IS a well-defined way to enlarge the class of functions to accommodate taking Fourier transforms of things like $e^{i \omega x}$, including giving rigorous definition to Dirac delta "functions" and the like. This is the theory of distributions. See, e.g.

Strichartz, "A Guide to Distribution Theory and Fourier Transforms"
Rudin, "Functional Analysis"
Hörmander, "The Analysis of Linear Partial Differential Operators Vol 1: Distribution Theory and Fourier Analysis"

Personally, I own

Javier Duoandikoetxea, "Fourier Analysis".

I don't have extensive experience with different kinds of Fourier analysis books, so I may not be in a position to recommend one over another, but IMO Duoandikoetxea's book is good for those who are serious about mathematics. ("Serious" means that you actually want to prove things with Fourier analysis, and not only repeat "it works!" while calculating...) For example, it actually shows how to use the assumption of bounded variation in some proofs (the secret is a certain way of using the second mean value theorem). Personally, I have improved my understanding of Fourier analysis a lot through this book, after initially being left confused by some physicists' crap.

This is a book published by the AMS, which is a reliable publisher in mathematics.