mikeyork
I'm an oldie and not well-versed in the modern formalism used in stochastic calculus, so please bear with me. I'm aware of Lévy's characteristic function for stable distributions, though not familiar with its practicalities.
I have read that for alpha = 2 the stable distribution is Gaussian, and also that the Gaussian is the only stable distribution with finite variance. However, I think I have found a pdf that is a rational function, symmetric in x, with finite variance and power-law tails |x|^-4, for which repeated convolution with itself appears to converge to another rational function with |x|^-4 tails. If so, does this not imply convergence to a non-Gaussian stable distribution? And since the cumulative distribution will behave as |x|^-3, does this not imply alpha = 2?
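To make this concrete, here is a quick numerical check using a specific rational pdf as a stand-in (an assumption, since my actual pdf is not written out above): f(x) = (2/pi)/(1 + x^2)^2, which is symmetric, integrates to 1, has variance 1, and has |x|^-4 tails. Convolving it with itself on a grid shows that the density of the sum of two independent copies still has |x|^-4 tails:

```python
import numpy as np

# Illustrative rational pdf (assumption, not the pdf from the question):
#   f(x) = (2/pi) / (1 + x^2)^2
# It is symmetric, integrates to 1, has variance 1, and |x|^-4 tails.
x = np.linspace(-80.0, 80.0, 8001)   # odd-length grid, symmetric about 0
dx = x[1] - x[0]
f = (2.0 / np.pi) / (1.0 + x**2) ** 2

# Density of X1 + X2 by numerical self-convolution; mode='same' keeps
# the result aligned on the same symmetric grid.
g = np.convolve(f, f, mode="same") * dx

# The convolved density still has |x|^-4 tails: the log-log slope of g
# between x = 20 and x = 40 should be close to -4.
g20, g40 = np.interp(20.0, x, g), np.interp(40.0, x, g)
slope = np.log(g40 / g20) / np.log(2.0)
print(f"mass = {g.sum() * dx:.4f}, tail slope = {slope:.3f}")
```

The tails survive any finite number of convolutions, which is what makes the question about the infinite limit interesting.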
On a similar note, I have read that the Central Limit Theorem implies that any bounded distribution converges to a Gaussian. Does "bounded" mean that all moments are finite? If we are always dealing with sampled data, then of course all moments will be finite. But I have also read statements to the effect that any distribution with finite variance will converge to a Gaussian. Yet if we have a continuous distribution and take convolutions, as in my case, it would seem reasonable that the limiting distribution need not be Gaussian if not all moments are finite.
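The finite-variance case can also be probed by simulation. As a stand-in with the same tail behavior (again an assumption, since my pdf is not given above), the Student t distribution with 3 degrees of freedom has density tails ~ |x|^-4, finite variance (= 3), but an infinite fourth moment; normalized sums of its samples nevertheless look Gaussian, as the classical CLT predicts:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in distribution (assumption): Student t with 3 degrees of
# freedom, whose density has |x|^-4 tails and finite variance (= 3),
# but an infinite fourth moment.
n_terms, n_sums = 1000, 20000
samples = rng.standard_t(df=3, size=(n_sums, n_terms))

# CLT normalization: S_n / sqrt(n * Var) should approach N(0, 1).
z = samples.sum(axis=1) / np.sqrt(n_terms * 3.0)

within_1sigma = np.mean(np.abs(z) < 1.0)
print(f"mean = {z.mean():.3f}, std = {z.std():.3f}, "
      f"P(|Z|<1) = {within_1sigma:.3f}")  # ~0.683 for a standard normal
```

So finite variance alone, without finite higher moments, appears sufficient for the normalized sums to look Gaussian in this example, which is the crux of my confusion.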
Can anyone clear these issues up for me? Are cases like mine well known and understood? Is there any literature on them?