Are Complex Sinusoids an Orthogonal Basis for L^2(\mathbb{R}) Space?

In summary, the thread discusses whether the complex sinusoids form an orthogonal basis for the Hilbert space L^2(\mathbb{R}). There are serious technical obstacles: the orthogonality integral vanishes only as a Cauchy Principal Value, and the defining integral of the Fourier transform need not exist for a general function in L^2. "Nice" (Schwartz) functions and the theory of distributions are mentioned as tools for defining the Fourier transform in a rigorous manner.
  • #1
mnb96
Hello,
Let's consider the [tex]L^2(\mathbb{R})[/tex] space with an inner product, and the complex sinusoids in the interval [tex](-\infty,+\infty)[/tex].
Is it correct to say that the complex sinusoids form an orthogonal basis for this space?

One would need to have:

[tex]\int_{-\infty}^{+\infty}e^{ipx}e^{-iqx}dx=0[/tex]
for any [tex]p\neq q[/tex]

but if [itex]k=p-q[/itex], that integral is:

[tex]\int_{-\infty}^{+\infty}e^{ikx}dx[/tex]

and that integral is zero only considering its Cauchy Principal Value.
Is this allowed or not?
What rigorous restriction should I include in order to say that those functions are orthogonal?
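A numerical sanity check may help here. The sketch below (Python; my own illustration, not from the thread) uses the closed form [itex]\int_{-R}^{R}e^{ikx}dx = 2\sin(kR)/k[/itex] for [itex]k\neq 0[/itex] and shows that the symmetric truncations keep oscillating instead of converging:

```python
import numpy as np

def truncated_integral(k, R):
    """Closed form of the symmetric truncation int_{-R}^{R} e^{ikx} dx = 2 sin(kR)/k (k != 0)."""
    return 2.0 * np.sin(k * R) / k

k = 1.0
values = [truncated_integral(k, R) for R in (10.0, 100.0, 1000.0, 10000.0)]
# The truncations stay bounded in [-2/|k|, 2/|k|] but keep oscillating,
# so the improper integral has no limit as R -> infinity: for k != 0 it
# is bounded yet divergent, not zero.
print(values)
```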
 
  • #2
The latter integral is zero if k is not zero (which means p is not q, as you stated). But plugging in k=0 gives an infinite value (integrating the constant 1). So, with respect to k, the expression behaves like a delta function.
The equation:

[tex]\int^{\infty}_{-\infty}f_{p}(x)\bar{f}_{q}(x)dx=\delta (p-q)[/tex]

for a set of functions with a continuous parameter (p and q real) this is the condition of orthonormality, which for sets with a discrete parameter looks like:

[tex]\int^{\infty}_{-\infty}f_{m}(x)\bar{f}_{n}(x)dx=\delta_{m,n}[/tex]
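On a finite interval, by contrast, the discrete condition is easy to verify. The following Python sketch (my own check, using the conventional [itex]1/2\pi[/itex] normalization) confirms the Kronecker-delta relation for [itex]e^{inx}[/itex] on [itex][0,2\pi][/itex]:

```python
import numpy as np

# Uniform grid over one period, endpoint excluded so each point is counted once.
x = np.linspace(0.0, 2.0 * np.pi, 4096, endpoint=False)
dx = 2.0 * np.pi / 4096

def inner(m, n):
    """Normalized inner product (1/2pi) int_0^{2pi} e^{imx} conj(e^{inx}) dx."""
    integrand = np.exp(1j * m * x) * np.conj(np.exp(1j * n * x))
    return np.sum(integrand) * dx / (2.0 * np.pi)

print(abs(inner(3, 3)))  # equal integer frequencies: 1
print(abs(inner(3, 5)))  # distinct integer frequencies: 0 (to rounding)
```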
 
  • #3
Thanks a lot elibj123!
I still have few doubts about this issue:

1) When [itex]k\neq 0[/itex] the integral [itex]\int_{-\infty}^{+\infty}e^{ikx}dx[/itex] does not exist in general; it is zero only if we take its Cauchy Principal Value. Is this commonly accepted?

2) When [itex]k=0[/itex] the integral diverges, so the "basis functions" are not in [tex]L^2(\mathbb{R})[/tex]. However, Plancherel's Theorem states that the Fourier transform is an isometry [tex]L^2(\mathbb{R})\rightarrow L^2(\mathbb{R})[/tex]. How is it possible that the complex sinusoids [itex]e^{ikx}[/itex] are basis functions of the space [itex]L^2(\mathbb{R})[/itex] but at the same time are not in this space? In a vector space the basis vectors should always be contained in the space.
 
  • #4
mnb96 said:
Is it correct to say that the complex sinusoids form an orthogonal basis for this space?

It is not correct, at least not in the standard terminology of Hilbert spaces.

mnb96 said:
In a vector-space the basis vectors should always be contained in the space.

This remark is correct.

mnb96 said:
What rigorous restriction should I include in order to say that those functions are orthogonal?

I don't know. It seems that nobody is really interested in coming up with rigorous definitions that would allow these functions to be called orthogonal.

I have been left with the impression that the most important thing here is to understand how to prove the inverse Fourier transform under various kinds of assumptions. Once you know how to prove that the inverse Fourier transform works, you can leave this orthogonality business on a heuristic level.

elibj123 said:
The latter integral is zero if k is not zero (which means p is not q, as you stated). But plugging in k=0 gives an infinite value (integrating the constant 1). So, with respect to k, the expression behaves like a delta function.

These are erroneous claims. Firstly, the integral is not zero when [itex]k\neq 0[/itex]; it is a divergent integral. Secondly, one cannot conclude that something behaves like a delta function simply because it is infinite at the origin. The rate of divergence matters very much.

It is true that if [itex]f[/itex] is continuous at the origin, has bounded variation in some neighborhood of the origin (meaning that it can be written there as a difference of two monotone functions), and is integrable over [itex][a,b][/itex] with [itex]a<0<b[/itex], then

[tex]
\lim_{R\to\infty} \int\limits_a^b f(k)\Big(\int\limits_{-R}^R e^{ikx} dx\Big) dk = 2\pi f(0)
[/tex]

This mathematically rigorous result is one way of giving the mysterious delta function equation some meaning.
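The limit can also be watched numerically. In this Python sketch (the Gaussian test function and the quadrature parameters are my choices, not from the thread), the inner integral is replaced by its closed form [itex]2\sin(kR)/k[/itex], and the outer integral approaches [itex]2\pi f(0)[/itex]:

```python
import numpy as np

def outer_integral(R, a=-1.0, b=1.0, n=400001):
    """Trapezoidal approximation of int_a^b f(k) * (int_{-R}^{R} e^{ikx} dx) dk."""
    k = np.linspace(a, b, n)
    # 2*sin(kR)/k written via np.sinc (= sin(pi t)/(pi t)) to stay finite at k = 0
    kernel = 2.0 * R * np.sinc(k * R / np.pi)
    f = np.exp(-k ** 2)          # smooth test function with f(0) = 1
    y = f * kernel
    return np.sum((y[1:] + y[:-1]) * 0.5) * (k[1] - k[0])

for R in (10.0, 100.0, 1000.0):
    print(R, outer_integral(R))  # tends to 2*pi*f(0) = 2*pi as R grows
```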

mnb96 said:
1) When [itex]k\neq 0[/itex] the integral [itex]\int_{-\infty}^{+\infty}e^{ikx}dx[/itex] does not exist in general; it is zero only if we take its Cauchy Principal Value. Is this commonly accepted?

What do you mean by Cauchy principal value? Doesn't it normally mean that you somehow control the rate at which some parameters approach something? Like setting [itex](R_1,R_2)=(-R,R)[/itex] and [itex]R\to\infty[/itex], instead of simply [itex](R_1,R_2)\to (-\infty,\infty)[/itex]? I don't see how this would help here.
 
  • #5
mnb96 said:
2) When [itex]k=0[/itex] the integral diverges, so the "basis functions" are not in [tex]L^2(\mathbb{R})[/tex]. However, Plancherel's Theorem states that the Fourier transform is an isometry [tex]L^2(\mathbb{R})\rightarrow L^2(\mathbb{R})[/tex].

You are of course correct that the exponential functions are not in [itex]L^2(\mathbb{R})[/itex] (nor in [itex]L^1(\mathbb{R})[/itex] for that matter).

I would like to comment that the Fourier transform is indeed an isometry from [itex]L^2 \rightarrow L^2[/itex], but there are a lot of technicalities behind this statement.

You might correctly ask: if [itex]f \in L^2(\mathbb{R})[/itex], why does the defining integral for the Fourier transform of [itex]f[/itex], namely

[tex]\int_{-\infty}^\infty f(x) e^{-i\omega x}\,dx[/tex]

necessarily exist, i.e. does the Fourier transform even exist for all [itex]f \in L^2(\mathbb{R})[/itex]?

The answer is that it doesn't necessarily exist, at least not as defined by the above integral. In fact, since [itex]|f(x) e^{-i\omega x}| = |f(x)|[/itex], the integral exists if and only if [itex]f \in L^1[/itex]. So to define the Fourier transform on [itex]L^2[/itex] one has to use a limiting argument.

Essentially, one proves that for any [itex]f \in L^2[/itex], there exists a sequence of "nice" (in this case Schwartz) functions [itex]f_n \in L^2 \cap L^1[/itex] that converge to [itex]f[/itex] in [itex]L^2[/itex]. Since each [itex]f_n[/itex] is in [itex]L^1[/itex], the integral above defines the Fourier transform for [itex]f_n[/itex]. The "niceness" of the functions ensures that [tex]\hat{f}_n \rightarrow \hat{f}[/tex], where [tex]\hat{f} \in L^2[/tex] is then defined as the Fourier transform of [itex]f[/itex].

There are a lot of details to check, most importantly that [tex]\hat{f}[/tex] is well-defined (independent of which sequence [itex]f_n[/itex] you choose) and that it agrees with the standard definition when [itex]f[/itex] is in [itex]L^1 \cap L^2[/itex].
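The truncation construction can be illustrated numerically. In the Python sketch below (the example function and quadrature step are my choices, not from the thread), [itex]f(x)=1/(1+|x|)[/itex] lies in [itex]L^2[/itex] but not in [itex]L^1[/itex], and the transforms of the truncations [itex]f_n = f\cdot\chi_{[-n,n]}[/itex] stabilize at a fixed frequency as the window grows:

```python
import numpy as np

def fhat_n(w, n, dx=1e-3):
    """Midpoint-rule approximation of int_{-n}^{n} f(x) e^{-iwx} dx for f(x) = 1/(1+|x|)."""
    m = int(round(2.0 * n / dx))
    x = -n + (np.arange(m) + 0.5) * dx   # symmetric midpoint grid on [-n, n]
    f = 1.0 / (1.0 + np.abs(x))
    return np.sum(f * np.exp(-1j * w * x)) * dx

# f is in L^2 but not L^1, yet at a fixed frequency w != 0 the truncated
# transforms settle down as the window [-n, n] grows (the tail converges
# only conditionally, which is exactly why a limiting argument is needed).
for n in (10.0, 100.0, 1000.0):
    print(n, fhat_n(1.0, n))
```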

Good references for this include the following:

Jones, "Lebesgue Integration on Euclidean Space"
Rudin, "Real and Complex Analysis"
Stein and Shakarchi, "Real Analysis"

By the way, there IS a well-defined way to enlarge the class of functions to accommodate taking Fourier transforms of things like [itex]e^{i \omega x}[/itex], including giving rigorous definition to Dirac delta "functions" and the like. This is the theory of distributions. See, e.g.

Strichartz, "A Guide to Distribution Theory and Fourier Transforms"
Rudin, "Functional Analysis"
Hörmander, "The Analysis of Linear Partial Differential Operators Vol 1: Distribution Theory and Fourier Analysis"
 
  • #6
Personally, I own

Javier Duoandikoetxea, "Fourier Analysis".

I don't have extensive experience with different kinds of Fourier analysis books, so I may not be in a position to recommend one over another, but IMO Duoandikoetxea's book is good for those who are serious about mathematics. ("Serious" means that you actually want to prove things with Fourier analysis, not only repeat "it works!" while calculating...) For example, it actually shows how to use the assumption of bounded variation in some proofs (the secret is a certain way of using the second mean value theorem). Personally, I have learned a lot of Fourier analysis from this book, after initially being left confused by some physicists' hand-waving.

This is a book published by the AMS, which is a reliable source in mathematics.
 

1. What is the definition of orthogonality in the context of sinusoids?

Two sinusoids are orthogonal when their inner product is zero, i.e. when the integral of one times the complex conjugate of the other over the relevant interval vanishes. Informally, the two sinusoids have zero overlap or correlation with each other.

2. How is the orthogonality of sinusoids related to their frequencies?

Orthogonality is directly tied to frequency. Over an interval spanning a whole number of periods, two sinusoids are orthogonal exactly when their frequencies differ by a nonzero integer multiple of the fundamental frequency, or when they share a frequency but are 90 degrees out of phase. It is a yes-or-no property of the inner product, not a matter of degree.

3. What is the significance of orthogonality in signal processing?

Orthogonality is a crucial concept in signal processing because it allows us to decompose a complex signal into simpler, orthogonal components. This makes it easier to analyze and process the signal, as well as extract useful information from it.

4. Can two sinusoids with the same frequency be orthogonal?

Yes, provided they are 90 degrees out of phase: sin(x) and cos(x) have the same frequency, yet their product integrates to zero over a full period, so they are orthogonal. Two identical sinusoids (same frequency and phase), however, have a strictly positive inner product and are not orthogonal.
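As a quick numerical check (a Python sketch, my own illustration): sin(x) and cos(x) share a frequency yet integrate to zero against each other over one period, while sin(x) against itself does not:

```python
import numpy as np

# One period sampled uniformly, endpoint excluded so each point counts once.
x = np.linspace(0.0, 2.0 * np.pi, 4096, endpoint=False)
dx = 2.0 * np.pi / 4096

cross = np.sum(np.sin(x) * np.cos(x)) * dx  # same frequency, 90 deg apart: ~ 0
norm2 = np.sum(np.sin(x) * np.sin(x)) * dx  # sin against itself: ~ pi
print(cross, norm2)
```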

5. How is the orthogonality of sinusoids used in applications?

The concept of orthogonality is used in various applications such as signal processing, telecommunications, and even in music and sound engineering. In telecommunications, orthogonal frequency-division multiplexing (OFDM) uses the orthogonality of sinusoids to efficiently transmit multiple signals over the same frequency band. In music and sound engineering, orthogonal functions such as sine and cosine waves are used to create and manipulate sounds.
