Showing the uniform convergence of a series of Gaussian-like functions

dane502

Homework Statement



Prove that the series \sum_{n=0}^\infty e^{-n^2x^2} converges uniformly on the set \mathbb{R}\setminus\,]-\epsilon,\epsilon[, where \epsilon>0.

Homework Equations


n/a

The Attempt at a Solution


I have tried using the Weierstrass M-test, but I have not been able to find a suitable comparison series.
As my topic implies, I thought I could use a series of bell-curve-like functions, which I have some experience with from a probability course.
But I have a problem finding a suitable expression, let alone showing that such a series converges.

I would appreciate it if someone could help get me started (preferably without solving the entire exercise).
 
Try finding a function \ge e^{-n^2x^2} for every n whose integral over \mathbb{R} is finite.
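
For instance, such a comparison might look like this (a sketch, assuming one first uses |x|\ge\epsilon to bound each term by e^{-n^2\epsilon^2}):

% t \mapsto e^{-t^2\epsilon^2} is decreasing on [0,\infty), so the series is
% bounded by the corresponding Gaussian integral:
\[
\sum_{n=0}^{\infty} e^{-n^2\epsilon^2}
  \le 1 + \int_{0}^{\infty} e^{-t^2\epsilon^2}\,dt
  = 1 + \frac{\sqrt{\pi}}{2\epsilon} < \infty .
\]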
 
dane502 said:

Prove that the series \sum_{n=0}^\infty e^{-n^2x^2} converges uniformly on the set \mathbb{R}\setminus\,]-\epsilon,\epsilon[, where \epsilon>0. [...] I have tried using the Weierstrass M-test, but I have not been able to find a suitable comparison series.

Well, since e^{-n^2x^2} decreases as |x| increases, you have

e^{-n^2x^2}\le e^{-n^2\epsilon^2}

if |x| ≥ ε. Now, since negative exponentials decay fast, you should be able to construct a comparison with \sum 1/n^2.
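
In more detail, one possible way to get that comparison (the elementary bound e^t \ge t^2/2 for t\ge 0 is just one convenient choice, and M_n is notation introduced here):

% For |x| \ge \epsilon put M_n := e^{-n^2\epsilon^2}, so that e^{-n^2x^2} \le M_n.
% Applying e^t \ge t^2/2 with t = n^2\epsilon^2 gives, for n \ge 1,
\[
M_n = e^{-n^2\epsilon^2} \le \frac{2}{n^4\epsilon^4} \le \frac{2}{\epsilon^4}\cdot\frac{1}{n^2},
\]
% so \sum_n M_n converges and the Weierstrass M-test yields uniform convergence
% on \mathbb{R}\setminus\,]-\epsilon,\epsilon[.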
 
Got it! Thank you very much, LCKurtz.

Out of personal interest, would someone care to comment on whether or not the sets

\mathbb{R}\setminus\{0\}

and

\mathbb{R}\setminus\,]-\epsilon,\epsilon[

where \epsilon>0, are the same?
 
dane502 said:
Out of personal interest, would someone care to comment on whether or not the sets \mathbb{R}\setminus\{0\} and \mathbb{R}\setminus\,]-\epsilon,\epsilon[, where \epsilon>0, are the same?

No, they aren't. ε/2 is in the first but not the second.
 
That is a convincing argument, although I have a hard time visualizing the difference between the two sets when ε→0.
With regard to the original topic, I have another question. We have shown that the series converges uniformly, which is all I needed to show, but I would also like to know what it converges to, i.e. the limit function. Would someone care to comment on that?
 
If I may add another question to the above, are the two sets equal for ε→0?
 
dane502 said:
If I may add another question to the above, are the two sets equal for ε→0?

For ε→0 the sets are the same. But if the only requirement is ε>0, then LCKurtz's argument stands. I don't know how to calculate the value of the limit, but try using Maple, Mathematica, etc.
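
One way to make the ε→0 statement precise is to take the union over all ε>0 (a short aside, not needed for the original exercise):

% Every x \ne 0 lies outside ]-\epsilon,\epsilon[ as soon as \epsilon < |x|, hence
\[
\bigcup_{\epsilon>0}\bigl(\mathbb{R}\setminus\,]-\epsilon,\epsilon[\bigr)
  = \mathbb{R}\setminus\{0\},
\]
% while for each fixed \epsilon>0 the two sets differ (e.g. at \epsilon/2).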
 
I only have Maple, and it is unable to evaluate the sum. Does anybody have another idea?
 
dane502 said:
I only have Maple, and it is unable to evaluate the sum. Does anybody have another idea?

I don't think you are likely to find a formula for the sum. Probably the best you could hope for is that it is a common enough sum that it has been given a name and its properties have been studied as, for example, Bessel functions have. Or it may be just another convergent series with a no-name sum.
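
As it happens, the sum can be tied to a named special function, the Jacobi theta function; with the convention \vartheta_3(q)=\sum_{n=-\infty}^{\infty} q^{n^2}, one can write:

% Put q = e^{-x^2} (so 0 < q < 1 for x \ne 0); splitting the two-sided sum at n = 0 gives
\[
\sum_{n=0}^{\infty} e^{-n^2x^2}
  = \frac{1}{2}\Bigl(1 + \vartheta_3\bigl(e^{-x^2}\bigr)\Bigr),
\]
% so the sum does carry a standard name, as suggested above, rather than an
% elementary closed form.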
 