architect
Hi,
I am wondering what happens to a Gaussian distribution when its mean tends to infinity. Looking at the equation of a Gaussian, one might infer that the pointwise limit at any fixed x goes to zero; but does this imply that it reduces to a Dirac delta function? If we integrate this pointwise limit we obtain 0, whereas a Dirac delta integrates to 1, so I am not convinced that it actually becomes one, especially since increasing the mean should not change the standard deviation.
At least we are certain that a Gaussian with fixed mean \mu approximates a Dirac delta \delta(x - \mu) when its standard deviation \sigma goes to zero.
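A quick numerical sketch of the two behaviors described above (my own illustration, not part of the original question): at any fixed point x the density shrinks to zero as the mean grows, yet the total mass stays 1 for every value of the mean, so the mass escapes to infinity rather than concentrating at a point the way a Dirac delta would require.

```python
import math

def gauss_pdf(x, mu, sigma=1.0):
    """Density of N(mu, sigma^2) evaluated at x."""
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

# Pointwise behavior: at the fixed point x = 0, the density
# collapses toward 0 as mu -> infinity.
vals = [gauss_pdf(0.0, mu) for mu in (1, 10, 100)]

def total_mass(mu, sigma=1.0, half_width=10.0, n=100_000):
    """Midpoint-rule integral of the density over [mu - half_width, mu + half_width].

    With half_width = 10 sigma this captures essentially all the mass.
    """
    a = mu - half_width
    h = 2 * half_width / n
    return sum(gauss_pdf(a + (i + 0.5) * h, mu, sigma) for i in range(n)) * h

# Total mass stays ~1 no matter how large mu is: the bump does not
# vanish, it just moves off toward infinity.
masses = [total_mass(mu) for mu in (1.0, 50.0, 1000.0)]
```

So the pointwise limit (zero everywhere, integrating to 0) and the constant total mass (always 1) are incompatible with a Dirac delta at any finite location, in contrast with the \sigma -> 0 case, where the mass concentrates at the fixed mean.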
Thanks
Alex