Convergence in the sense of distributions

In summary: the sequence [itex]e^{inx}[/itex] tends to zero in the sense of distributions as [itex]n\to\infty[/itex]; one way to see this is via the Riemann–Lebesgue lemma, which says that the Fourier transform of a summable function vanishes at infinity.
  • #1
QuArK21343
I have the following problem: prove that the sequence [itex]e^{inx}[/itex] tends to [itex]0[/itex], in the sense of distributions, as [itex]n\to \infty[/itex]. Here is how I approached the problem. I have to prove this:

[tex]\lim \int e^{inx}\phi(x)\,dx=0[/tex]

where [itex]\phi[/itex] is a test function. I changed variable: [itex]nx=x'[/itex] and got:

[tex]\lim \frac{1}{n}\int e^{ix'}\phi(x'/n)dx'[/tex]

Now, can I exchange limit and integral? I would say yes, by dominated convergence: the absolute value of the integrand is less than, say, [itex]c|\phi|[/itex], which is summable, and the limit of the integrand exists because [itex]\phi[/itex] is continuous. So,

[tex]\phi(0)\lim \frac{1}{n}\int e^{ix'}dx'[/tex]

But can I say that this last limit is zero? I mean, shouldn't the limit function be summable, again by dominated convergence? I suspect I have a wrong understanding either of convergence in the sense of distributions or of dominated convergence. Can you clear up my doubts?
 
  • #2
The idea is right. But I don't quite understand why you use terms like "summable". Can you tell us what exactly you mean by dominated convergence?
 
  • #3
What are the limits of integration? [itex]\int e^{ix}\,dx = 0[/itex] when integrating over an interval of length [itex]2\pi[/itex].
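
Explicitly, over one full period:

[tex]\int_0^{2\pi} e^{ix}\,dx=\left[\frac{e^{ix}}{i}\right]_0^{2\pi}=\frac{1-1}{i}=0.[/tex]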
 
  • #4
micromass said:
But I don't quite understand why you use things like "summable".
I think summable is an older term for integrable.
 
  • #5
lugita15 said:
I think summable is an older term for integrable.

Oh, that would make sense actually.
 
  • #6
Ok, by summable I mean that the integral over all space exists and is finite; I don't know if the same terminology is used in English. Also, I didn't write the limits of integration, but they are over all space (let's say over all of R or R^N). The dominated convergence theorem I have studied says that if a sequence of measurable functions converges pointwise and the modulus of these functions (indexed by n) is dominated by a nonnegative summable function, then I can exchange limit and integral, and the limit function is automatically summable:

[tex]\lim\int f_n\, dx=\int f\, dx[/tex]

if [itex]|f_n|\leq g[/itex] almost everywhere for some summable function [itex]g[/itex]. Is this correct? In my part of the world it is late at night, so I will check for answers tomorrow morning. Thanks again for your help!
 
  • #7
In that case, everything looks ok.
 
  • #8
What worries me is that the last integral I get (the integral of the complex exponential) does not exist, does it? Shouldn't the limit function be summable?
 
  • #9
Note that the test function [itex]\varphi[/itex] is supported in a compact set K. So what you're actually doing is calculating

[tex]\lim_{n\rightarrow +\infty} \int_K e^{inx}\varphi(x)dx[/tex]

If you work from this then you end up with

[tex]\int_K e^{ix}\,dx[/tex]

which does exist.
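
For a test function that vanishes outside K, one concrete way to get the limit is a single integration by parts (a sketch; the boundary terms vanish because [itex]\varphi[/itex] is zero at the endpoints of K):

[tex]\int_K e^{inx}\varphi(x)\,dx=-\frac{1}{in}\int_K e^{inx}\varphi'(x)\,dx, \qquad \left|\int_K e^{inx}\varphi(x)\,dx\right|\le \frac{1}{n}\int_K |\varphi'(x)|\,dx\ \to\ 0.[/tex]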
 
  • #10
I see, but my definition of a test function is an infinitely differentiable function that goes to zero faster than any inverse power. Is this equivalent to saying that it has compact support?
 
  • #11
QuArK21343 said:
I see, but my definition of a test-function is an infinitely differentiable function that goes to zero faster than any inverse power. Is this equivalent to saying that it has a compact support?

No. But it goes to zero faster than what?? Inverse powers of polynomials??

In that case, applying dominated convergence does not seem correct here. You say that it is smaller than [itex]c|\varphi|[/itex]. But I don't see how that can be true. I get

[tex]\left| \int e^{ix}\varphi(x/n)\,dx\right| \leq \int |\varphi(x/n)|\,dx=n\int |\varphi(u)|\,du=n\|\varphi\|_1[/tex]

But this bound depends on [itex]n[/itex], and the integrand is not dominated by a single summable function independent of [itex]n[/itex].
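
To see concretely that no summable dominating function can exist, take for example [itex]\varphi(x)=e^{-x^2}[/itex] (just an illustrative choice). Any [itex]g[/itex] with [itex]g(x)\ge |e^{ix}\varphi(x/n)|=e^{-x^2/n^2}[/itex] for every [itex]n[/itex] must satisfy

[tex]g(x)\ \ge\ \sup_n e^{-x^2/n^2}\ =\ \lim_{n\to\infty}e^{-x^2/n^2}\ =\ 1\qquad\text{for every }x,[/tex]

and the constant function 1 is not summable on [itex]\mathbb{R}[/itex].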
 
  • #12
They must be [itex]o(||x||^{-m})[/itex] for every [itex]m[/itex], and not only that: all their derivatives must satisfy the same condition (I was in a hurry this afternoon). And yes, you are right, my estimate is clearly wrong. My new attempted solution is this: just notice that the integral is the Fourier transform [itex]\hat \phi[/itex] evaluated at [itex]n[/itex] (up to the sign convention), so we are taking its limit as [itex]n\to \infty[/itex]. By the Riemann–Lebesgue lemma, the Fourier transform of a summable function vanishes at infinity.
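
To spell this out (a sketch, with the convention [itex]\hat\phi(\xi)=\int e^{-i\xi x}\phi(x)\,dx[/itex], so the integral above is [itex]\hat\phi(-n)[/itex]): the Riemann–Lebesgue lemma gives [itex]\hat\phi(\xi)\to 0[/itex] as [itex]|\xi|\to\infty[/itex] for any summable [itex]\phi[/itex]. For a Schwartz [itex]\phi[/itex] one even gets rapid decay, by repeated integration by parts:

[tex]|\hat\phi(\xi)|=\frac{1}{|\xi|^m}\left|\int e^{-i\xi x}\phi^{(m)}(x)\,dx\right|\le \frac{1}{|\xi|^m}\int|\phi^{(m)}(x)|\,dx\qquad\text{for every }m.[/tex]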
 
  • #13
The only problem is that the Fourier coefficients are for ##L^1## functions on the circle. For the infinite-domain case you need to appeal to the monotone convergence theorem, approximating from compact intervals.
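
One way to carry this out on the whole line (a sketch; this route uses density of compactly supported smooth functions in [itex]L^1[/itex] rather than monotone convergence): given [itex]\epsilon>0[/itex], pick a compactly supported smooth [itex]\psi[/itex] with [itex]\int|\varphi-\psi|\,dx<\epsilon[/itex]. Then

[tex]\left|\int e^{inx}\varphi(x)\,dx\right|\le \int|\varphi(x)-\psi(x)|\,dx+\left|\int e^{inx}\psi(x)\,dx\right|\le \epsilon+\frac{1}{n}\int|\psi'(x)|\,dx,[/tex]

and the last term is also below [itex]\epsilon[/itex] once [itex]n[/itex] is large enough.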
 

1. What is convergence in the sense of distributions?

A sequence of distributions [itex]T_n[/itex] (for example, locally integrable functions such as [itex]e^{inx}[/itex] viewed as distributions) converges to [itex]T[/itex] in the sense of distributions if [itex]\langle T_n,\phi\rangle \to \langle T,\phi\rangle[/itex] for every test function [itex]\phi[/itex]. For ordinary functions this means [itex]\int f_n(x)\phi(x)\,dx \to \int f(x)\phi(x)\,dx[/itex]. It is also called weak convergence of distributions.

2. How is convergence in the sense of distributions different from other types of convergence?

It is weaker than pointwise, uniform or [itex]L^1[/itex] convergence: a sequence can converge in the sense of distributions without converging at any single point, as the example [itex]e^{inx}\to 0[/itex] shows. Conversely, the usual modes of convergence (for instance convergence in [itex]L^1_{loc}[/itex]) imply convergence in the sense of distributions.

3. What is the significance of convergence in the sense of distributions?

It allows limits, derivatives and Fourier transforms to be taken for objects that are not ordinary functions, such as the Dirac delta. Moreover, differentiation is continuous for this notion of convergence: if [itex]T_n\to T[/itex] in the sense of distributions, then [itex]T_n'\to T'[/itex] as well.

4. What are some common examples of convergence in the sense of distributions?

Besides the oscillatory example [itex]e^{inx}\to 0[/itex] discussed above (a consequence of the Riemann–Lebesgue lemma), a standard example is an approximate identity: if [itex]\phi[/itex] is summable with [itex]\int\phi\,dx=1[/itex], then [itex]n\phi(nx)\to\delta[/itex], the Dirac delta, in the sense of distributions.

5. What are the conditions for convergence in the sense of distributions to occur?

One only has to check that [itex]\langle T_n,\phi\rangle[/itex] converges for every test function in the chosen space: compactly supported smooth functions for ordinary distributions, or Schwartz functions (rapidly decreasing smooth functions, as in this thread) for tempered distributions. The limiting functional is then automatically a distribution.
