# Bounded/Truncated Gaussian distribution

1. Jan 22, 2012

### architect

Dear all,

I have a problem in understanding how to bound a Gaussian distribution. Let me describe the problem at hand: suppose we have a Gaussian distribution in the x-coordinate and a Gaussian distribution in the y-coordinate, and assume that the independent random variables x and y are defined from -infty to +infty. The product of the marginal densities of x and y then gives a joint distribution with infinite support. If one then converts to polar coordinates, so that x, y become r, phi, it follows directly that r is defined from 0 to infty.

This is what I am trying to avoid. I would like to define this distribution such that the random variable r is lower bounded by some value, say r_min. In this respect, I would like to define x, y such that

$x^2 + y^2 > r_{min}^2$,

where $r_{min}$ denotes the minimum distance. I imagine this as a circular region inside which the probability of finding a point is 0, and outside which the random variable r is defined (and properly normalized).

My question essentially boils down to this. How does one achieve this? Is it by truncating the normal distribution (left truncation)?

BR,

Alex

2. Jan 22, 2012

### Stephen Tashi

As I imagine your problem (and, by the way, you'd get better advice if you actually described the real-world scenario for the problem, if it has one), you want a distribution of points in the plane outside of some empty circle defined by $r_{min}$. You can't get that by truncating the two independent normal distributions for x and y, since that could leave an empty square instead of a circle.

Let $f(x,y)$ be the joint density of the points, ignoring any restriction on r. Compute the probability that a point lands in the circle of radius $r_{min}$ by integrating $f(x,y)$ over that circle. Let's call that probability $p_0$. The density for the distribution of points when you exclude the possibility of them landing in the circle is $\frac{f(x,y)}{1-p_0}$.

If you want to know the marginal distribution for, say, y then you have to integrate $\frac{f(x,y)}{1-p_0}$ with respect to x. When the y value causes the integration to be on a line passing through the omitted circular region, you have to use the correct limits of integration for x so that the points in the circular region are not included.
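As a concrete sketch of this construction (assuming a standard isotropic unit-variance Gaussian and an illustrative $r_{min} = 1$, both hypothetical choices), rejection sampling draws from exactly the density $\frac{f(x,y)}{1-p_0}$: keep only the points that land outside the circle.

```python
import numpy as np

rng = np.random.default_rng(0)
r_min = 1.0        # illustrative exclusion radius
n = 100_000

# Draw (x, y) from a standard isotropic Gaussian and reject anything
# inside the circle; the survivors follow the density f(x, y) / (1 - p0).
pts = rng.standard_normal((n, 2))
r = np.hypot(pts[:, 0], pts[:, 1])
accepted = pts[r >= r_min]

# The acceptance rate estimates 1 - p0.  For this isotropic unit-variance
# case, p0 has the closed form 1 - exp(-r_min**2 / 2) (a Rayleigh CDF).
p0_exact = 1.0 - np.exp(-r_min**2 / 2)
print(len(accepted) / n)        # close to 1 - p0_exact, about 0.6065
```

The acceptance rate itself is a Monte Carlo estimate of $1 - p_0$, which for this special case has the closed form $e^{-r_{min}^2/2}$.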

3. Jan 22, 2012

### architect

Stephen,

thanks for your reply. Let me describe my problem in a little more detail. My aim here is to distribute a set of points in 2D space under the Gaussian model and subsequently obtain the joint distribution in polar coordinates, i.e. r, phi. However, the problem is that the random variable needs to be defined from a certain point onwards in the radial domain (the r_min defined earlier). In other words, this set of Gaussian-distributed points cannot lie within a distance less than r_min. The reason for this restriction comes from the problem under consideration, which would take some time to explain here. Nonetheless, my aim is to derive a distribution such that this set of points lies from r_min to infty, not from 0 to infty, while still being Gaussian distributed.

In your last post you propose to begin by computing the probability that a point lands in the circle of radius r_min by integrating f(x,y) over that circle. Do you mean computing the probability as follows:

$P_o = \int_{0}^{r_{min}} \int_{0}^{2\pi}f(r,\phi)d\phi dr$ ??

Then, the density of points lying outside this circle would be:

$g(r,\phi) = \frac{f(r,\phi)}{1-P_o}$

The truth is that I do not quite understand how this new joint density is normalized. Is it?

Thanks a lot,

Alex

Last edited: Jan 22, 2012
4. Jan 22, 2012

### Stephen Tashi

You need the integrand to be $f(r,\phi) r d\phi dr$ to include the "area element" for polar coordinates, but yes, that's essentially what I mean. You can also do the integration in cartesian coordinates using variable limits of integration.
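For instance, the Cartesian route can be sketched numerically like this (assuming an isotropic unit-variance Gaussian and an illustrative $r_{min} = 1$; the midpoint rule and grid size are arbitrary choices). For each x, the limits for y are $\pm\sqrt{r_{min}^2 - x^2}$:

```python
import math

r_min, sigma = 1.0, 1.0   # illustrative values; isotropic Gaussian assumed

def f(x, y):
    # joint density of two independent N(0, sigma^2) variables
    return math.exp(-(x * x + y * y) / (2 * sigma**2)) / (2 * math.pi * sigma**2)

# Integrate f over the circle with variable limits: for each x in
# [-r_min, r_min], y runs over [-sqrt(r_min^2 - x^2), +sqrt(r_min^2 - x^2)].
n = 400                       # midpoint rule in each direction
hx = 2 * r_min / n
p0 = 0.0
for i in range(n):
    x = -r_min + (i + 0.5) * hx
    y_lim = math.sqrt(max(r_min**2 - x * x, 0.0))
    hy = 2 * y_lim / n
    p0 += sum(f(x, -y_lim + (j + 0.5) * hy) for j in range(n)) * hy * hx

print(round(p0, 3))           # agrees with 1 - exp(-r_min**2/2), about 0.393
```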

If you integrated $f(r,\phi)$ over the whole plane except for the circle, you'd get $1.0 - P_0$ instead of $1.0$, because the circle is left out. Hence, when the density is modified by dividing by the factor $1 - P_0$, its integral over the whole plane except for the circle is $\frac{1 - P_0}{1 - P_0} = 1$.
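To make the bookkeeping explicit, a quick numerical check (assuming an isotropic unit-variance Gaussian, an illustrative $r_{min} = 1$, and truncating the radial integral at a large $R$) confirms that the modified density integrates to 1 outside the circle:

```python
import math

r_min, sigma = 1.0, 1.0       # illustrative values; isotropic Gaussian assumed

def f(r):
    # f(r, phi) for the isotropic case: exp(-r^2/(2 sigma^2)) / (2 pi sigma^2)
    return math.exp(-r**2 / (2 * sigma**2)) / (2 * math.pi * sigma**2)

# Closed-form P0 for this special case (phi simply contributes 2*pi).
P0 = 1.0 - math.exp(-r_min**2 / (2 * sigma**2))

# Integrate g(r, phi) = f(r, phi) / (1 - P0) over r in [r_min, R] and
# phi in [0, 2*pi], with the polar area element r dr dphi (midpoint rule).
R, n = 10.0, 200_000
h = (R - r_min) / n
mids = (r_min + (k + 0.5) * h for k in range(n))
total = sum(f(r) * r for r in mids) * 2 * math.pi * h / (1.0 - P0)
print(round(total, 6))        # close to 1.0
```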

5. Jan 23, 2012

### architect

Stephen,

thanks once more for your time and reply. As I understand it, what you propose is not far from what I initially thought, but please correct me if I am wrong.

As mentioned in my first post, left-truncating the distribution will probably give us the desired result. A truncated distribution where just the bottom of the distribution has been removed is as follows:

$f(r,\phi \mid R > r_{min}) = \frac{f(r,\phi)}{1 - F(r_{min})}$, (1)

where $F(r_{min}) = \int_0^{2\pi}\int_0^{r_{min}} f(r,\phi)\, r\, dr\, d\phi$ denotes the CDF of $R$ evaluated at $r_{min}$. Equation 1 seems to follow exactly what you proposed. Correct?
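Incidentally, if Equation 1 is right, inverting the truncated radial CDF gives a direct sampler. A sketch for the isotropic unit-variance case, where $F(r) = 1 - e^{-r^2/2}$, with an illustrative $r_{min} = 1$:

```python
import math
import random

random.seed(0)
r_min = 1.0   # illustrative exclusion radius; isotropic unit-variance Gaussian

def sample_r():
    # Invert the left-truncated CDF (F(r) - F(r_min)) / (1 - F(r_min)):
    # with F(r) = 1 - exp(-r^2/2) this solves to the expression below.
    u = random.random()
    return math.sqrt(r_min**2 - 2.0 * math.log(1.0 - u))

samples = [sample_r() for _ in range(50_000)]
print(min(samples) >= r_min)      # True: no point falls inside the circle
```

The empirical fraction of samples below any threshold $t$ should then match $1 - e^{-(t^2 - r_{min}^2)/2}$.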

Best Regards,

Alex

6. Jan 23, 2012

### Stephen Tashi

Yes, that's correct.

(In your original post, you mentioned truncating a normal distribution. $f(r,\phi)$ will not be a normal distribution.)

7. Jan 23, 2012

### architect

Stephen,

is there any way to thank you and acknowledge your help in the forum?

Best Regards,

Alex

8. Jan 23, 2012

### Stephen Tashi

Thanks is thanks enough. Besides, I feel obligated. I come from a family of architects (father and one brother) - in case your user name is descriptive.