Can you explain the concept of random variables with no mean?

Discussion Overview

The discussion revolves around the interpretation of random variables that do not possess a mean, specifically focusing on the Cauchy distribution and other distributions like the Pareto distribution. Participants seek intuitive explanations and visualizations to better understand the mathematical implications of such distributions.

Discussion Character

  • Exploratory
  • Technical explanation
  • Conceptual clarification
  • Debate/contested

Main Points Raised

  • Some participants explain that the Cauchy distribution lacks a mean due to its heavy tails, which prevent the integral defining the mean from converging.
  • Others mention that there is no requirement for a random variable to have finite moments, indicating that care must be taken when applying standard theorems.
  • A participant points out that while a sample mean can be calculated from a Cauchy distribution, the law of large numbers does not apply, and the average of multiple Cauchy variables retains the Cauchy distribution.
  • Another participant introduces the t-distribution with degrees of freedom greater than 2, noting that it has moments of order less than its degrees of freedom but no higher moments.
  • Questions arise regarding the interpretation of convergence and limits in the context of the mean, with a participant seeking clarification on whether the integral defining the mean cannot be computed or results in an arbitrarily large or small number.

Areas of Agreement / Disagreement

Participants express various viewpoints on the implications of having no mean in certain distributions, with some agreeing on the technical aspects while others raise questions about the underlying concepts. The discussion remains unresolved regarding the intuitive understanding of these mathematical properties.

Contextual Notes

Participants highlight limitations in understanding convergence and the behavior of integrals defining the mean and variance, emphasizing that these integrals must converge for the corresponding moments to exist.

ych22
Hello,

How do we interpret the fact that a random variable can have no mean? For example the Cauchy distribution, which arises from the ratio of two standard normal distributions.

I seek intuitive explanations or visualisations to understand math "facts" better.
 
ych22 said:
Hello,

How do we interpret the fact that a random variable can have no mean? For example the Cauchy distribution, which arises from the ratio of two standard normal distributions.

I seek intuitive explanations or visualisations to understand math "facts" better.

I'm not quite sure what you mean? The Cauchy distribution does not have a mean because its tails are "too heavy" - the ends of the distribution do not decrease quickly enough for the integral that defines the mean to converge.
 
ych22 said:
Hello,

How do we interpret the fact that a random variable can have no mean? For example the Cauchy distribution, which arises from the ratio of two standard normal distributions.

I seek intuitive explanations or visualisations to understand math "facts" better.

Another example is the Pareto distribution for certain values of the tail parameter.

There's no requirement that a random variable have any finite moments. It just means that care needs to be taken when applying the usual theorems. For example, the mean of any finite sample is finite, but since the law of large numbers no longer applies, the sample mean does not converge to a finite value as the sample size increases. In fact, the average of 1000 Cauchy variables has the same distribution as a single Cauchy variable!
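This is easy to check numerically. The following Python sketch (illustrative, not from the thread; it draws standard Cauchy variates by inverse-CDF sampling) compares the spread of single Cauchy draws against the spread of averages of 1000 draws. For a Cauchy distribution, averaging does not narrow the distribution at all:

```python
import math
import random
import statistics

random.seed(0)

def cauchy():
    # Inverse-CDF sampling: if U ~ Uniform(0,1), then tan(pi*(U - 1/2))
    # is a standard Cauchy variate.
    return math.tan(math.pi * (random.random() - 0.5))

def iqr(xs):
    # Interquartile range; for a standard Cauchy it is exactly 2.
    q1, _, q3 = statistics.quantiles(xs, n=4)
    return q3 - q1

singles = [cauchy() for _ in range(2000)]
averages = [statistics.fmean(cauchy() for _ in range(1000)) for _ in range(2000)]

print(iqr(singles))   # close to 2
print(iqr(averages))  # also close to 2: averaging 1000 draws did not shrink the spread
```

Had the draws come from a normal distribution instead, the second interquartile range would be roughly sqrt(1000) ≈ 32 times smaller than the first.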
 
bpet said:
There's no requirement that a random variable has any finite moments. It just means that care needs to be taken when applying the usual theorems. For example, the mean of any finite sample is finite but since the law of large numbers no longer applies the mean does not converge to a finite value as the sample size increases. In fact, the average of 1000 Cauchy variables has the same distribution as a single Cauchy variable!

There is a slight risk that, if this is read too quickly, a confusion of terms and ideas will result.

1) If you take a sample of any size from a Cauchy distribution, you can calculate [itex]\bar x[/itex], so a sample mean is always defined.

2) The convergence issue means this: since a Cauchy distribution does not have a mean, it does not have any higher-order moments: in particular, no variance. The lack of a population mean means that

a) The common statement that [itex]\bar X \to \mu[/itex] in probability does not apply

b) The CLT result, that the asymptotic distribution of the sample mean is a particular normal distribution, does not apply

c) It can be shown that, mathematically, if [itex]X_1, X_2, \dots, X_n[/itex] are i.i.d. Cauchy, the distribution of [itex]\bar X[/itex] is still Cauchy - not, as we are used to seeing, approximately normal

bpet's initial comment that "there is no requirement that a random variable has any finite moments; it just means that care needs to be taken when applying the usual theorems" is spot on.

As another odd example, consider a [itex]t[/itex] distribution with [itex]n[/itex] degrees of freedom. For any particular [itex]n > 2[/itex] the distribution has moments of order [itex]k < n[/itex] but no others.
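Points (a) and (c) can be made concrete with a small Python sketch (illustrative, not from the thread). Running means of normal draws settle down near the population mean, as the law of large numbers predicts; means of blocks of 10,000 Cauchy draws, by contrast, are themselves standard Cauchy, so they scatter as widely as individual draws:

```python
import math
import random
import statistics

random.seed(1)

def cauchy():
    # Standard Cauchy via inverse CDF: tan(pi*(U - 1/2)) for U ~ Uniform(0,1).
    return math.tan(math.pi * (random.random() - 0.5))

# Normal case: the sample mean converges to the population mean (here 0).
normal_mean = statistics.fmean(random.gauss(0.0, 1.0) for _ in range(200_000))
print(normal_mean)  # very close to 0

# Cauchy case: the mean of each block of 10,000 draws is itself standard
# Cauchy, so the block means scatter as widely as single draws do.
block_means = [statistics.fmean(cauchy() for _ in range(10_000)) for _ in range(20)]
print(min(block_means), max(block_means))
```

No matter how large the blocks are made, the block means never concentrate around a single value.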
 
Thanks statdad, I will settle for your answer.

I am not too familiar with convergence and limits theoretically to understand how "the ends of the distribution do not decrease quickly enough for the integral that defines the mean to converge" though.

Is it equivalent to "The integral that defines the mean cannot be computed" or "The integral that defines the mean would result in an arbitrarily large/small number"?
 
ych22 said:
Thanks statdad, I will settle for your answer.

Perhaps simply remembering that while probability does a good job of describing the world around us, it has mathematics as its underpinning, and there are some instances where our intuition will cause us to expect things that don't occur. Happens to everyone.

ych22 said:
I am not too familiar with convergence and limits theoretically to understand how "the ends of the distribution do not decrease quickly enough for the integral that defines the mean to converge" though.

Is it equivalent to "The integral that defines the mean cannot be computed" or "The integral that defines the mean would result in an arbitrarily large/small number"?

For continuous distributions the mean and variance are, by definition, these integrals.

[tex] \mu = \int_{-\infty}^\infty x f(x) \, dx, \quad \sigma^2 = \int_{-\infty}^\infty (x-\mu)^2 f(x) \, dx[/tex]

For either of these to exist the corresponding integral must exist. If the integral doesn't converge, the integrand (either [itex]x f(x)[/itex] or [itex](x-\mu)^2 f(x)[/itex])
doesn't go to zero fast enough as [itex]x \to \pm \infty[/itex]. That's all I meant.
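For the standard Cauchy density [itex]f(x) = 1/(\pi(1+x^2))[/itex] this can be made explicit. The truncated integral of [itex]|x| f(x)[/itex] has the closed form [itex]\ln(1+T^2)/\pi[/itex] (substitute [itex]u = 1 + x^2[/itex]), which grows without bound as the truncation point [itex]T[/itex] increases, so the defining integral never converges absolutely. A short Python check of this closed form:

```python
import math

def truncated_abs_mean(T):
    # Closed form of the truncated integral:
    #   ∫_{-T}^{T} |x| / (pi * (1 + x^2)) dx = ln(1 + T^2) / pi,
    # obtained by substituting u = 1 + x^2.
    return math.log(1 + T * T) / math.pi

for T in (10, 1_000, 1_000_000, 10**12):
    print(T, truncated_abs_mean(T))
# The values keep growing (logarithmically in T), so the integral
# defining the mean fails to converge: the Cauchy mean does not exist.
```

This answers the question directly: the integral is not "uncomputable", and it does not settle on any number, large or small; its truncated versions simply increase forever as the limits of integration widen.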
 
