Dear all,

I have a problem understanding how to bound a Gaussian distribution. Let me describe the problem at hand: say we have a Gaussian distribution in the x-coordinate and a Gaussian distribution in the y-coordinate, and assume the independent random variables x and y are each defined from -infty to +infty. The product of the marginal densities of x and y then gives a joint distribution with infinite support. Converting to polar coordinates, so that x, y become r, phi, it follows directly that r is defined from 0 to infty.

This is what I am trying to avoid. I would like to define this distribution such that the random variable r is lower bounded by some value, say r_min. In this respect, I would like to define x, y such that

[itex]x^2 + y^2 > r_{min}^2[/itex],

where [itex]r_{min}[/itex] denotes the minimum distance. I imagine this as a circular region (a disk of radius [itex]r_{min}[/itex] around the origin) inside which the probability of finding a point is 0, and outside which the random variable r is defined (and properly normalized).
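For the isotropic case (equal variances, zero means), this truncation can actually be sampled exactly: in polar coordinates the radius of such a Gaussian follows a Rayleigh distribution, [itex]P(R > r) = e^{-r^2/2\sigma^2}[/itex], so [itex]R^2[/itex] is exponential and, by memorylessness, conditioning on [itex]R > r_{min}[/itex] has a closed-form inverse CDF. A minimal sketch (my own illustration, with an assumed helper name `sample_annulus_gaussian`, not from the thread):

```python
import numpy as np

def sample_annulus_gaussian(n, r_min, sigma=1.0, rng=None):
    """Sample n points from an isotropic 2D Gaussian (std sigma)
    conditioned on x^2 + y^2 > r_min^2, using the exact inverse CDF.

    For the isotropic Gaussian, R^2 / (2 sigma^2) ~ Exponential(1),
    so P(R > r) = exp(-r^2 / (2 sigma^2)).  Memorylessness of the
    exponential gives, for U ~ Uniform(0, 1]:
        R = sqrt(r_min^2 - 2 sigma^2 * ln(U)) >= r_min.
    The angle phi stays uniform on [0, 2*pi).
    """
    rng = np.random.default_rng() if rng is None else rng
    u = 1.0 - rng.uniform(size=n)          # in (0, 1], so log(u) is finite
    r = np.sqrt(r_min**2 - 2.0 * sigma**2 * np.log(u))
    phi = rng.uniform(0.0, 2.0 * np.pi, size=n)
    return r * np.cos(phi), r * np.sin(phi)

x, y = sample_annulus_gaussian(5000, r_min=2.0)
print(np.hypot(x, y).min())  # always >= r_min = 2.0
```

Note this exact trick relies on the circular symmetry of the isotropic Gaussian; with unequal variances the radial distribution is no longer Rayleigh and one would fall back on rejection.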

My question essentially boils down to this: how does one achieve this? Is it by truncating the normal distribution (i.e., some form of left truncation)?
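For concreteness, the most general way to impose the constraint [itex]x^2 + y^2 > r_{min}^2[/itex] is rejection sampling: draw from the unconstrained bivariate Gaussian and discard points inside the disk, which is equivalent to the truncated (renormalized) density. A minimal sketch, assuming independent components with means and standard deviations of my own choosing (the function name `sample_truncated_2d` is illustrative):

```python
import numpy as np

def sample_truncated_2d(n, r_min, sigma=1.0, rng=None):
    """Draw n samples of (x, y), each component N(0, sigma^2),
    conditioned on x^2 + y^2 > r_min^2, via rejection sampling."""
    rng = np.random.default_rng() if rng is None else rng
    out = np.empty((0, 2))
    while out.shape[0] < n:
        pts = rng.normal(0.0, sigma, size=(n, 2))
        keep = pts[:, 0]**2 + pts[:, 1]**2 > r_min**2  # outside the disk
        out = np.vstack([out, pts[keep]])
    return out[:n]

samples = sample_truncated_2d(10000, r_min=1.0)
print(samples.shape)                          # (10000, 2)
print((samples**2).sum(axis=1).min() > 1.0)   # True: no point inside the disk
```

Rejection is simple and works for any means/covariance, but becomes slow if the excluded disk carries most of the probability mass (large r_min relative to sigma), since nearly every draw is rejected.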

Thanks in advance.

BR,

Alex

# Bounded/Truncated Gaussian distribution