About the definition of discrete random variable

The thread asks whether a discrete random variable (DRV) may take countably infinitely many values in a finite interval, for example the rational numbers. Hogg and Craig's definition allows at most finitely many values in any finite interval, while other sources treat discrete data as taking values from a countably infinite set such as the rationals. One reply argues that a DRV always draws its values from a finite set and that this must be distinguished from discrete data with rational-number values; another counters that a random variable with countably many possible values, each carrying a defined probability, is discrete. A quoted definition based on the cumulative distribution function (CDF), which increases only by jump discontinuities at the possible values, allows finitely or countably infinitely many jumps, even one at every rational number. The upshot is that the definition of a DRV is less obvious than it first appears.
Postante

Hogg and Craig stated that a discrete random variable takes on at most a finite number of values in every finite interval ("Introduction to Mathematical Statistics", Macmillan, 3rd Ed., 1970, p. 22).
This contrasts with the assumption that discrete data can take on countably infinitely many values, in particular rational numbers (D.W. Gooch, "Encyclopedic Dictionary of Polymers", App. E, p. 980, Springer, 2nd Ed., 2010).
I would like to know whether discrete random variables can, or cannot, take on countably infinitely many values in a finite interval. In other words, can the set of possible values of a discrete random variable be the set of rational numbers?
 


A DRV must take only a finite number of values. You need to check the difference between a DRV and discrete data.

Notice: the set of rational numbers would be continuous rather than discrete, but you can have discrete data that takes rational-number values.
However, you will never have an infinite number of those data points. The data set takes its values from a countably infinite set; it is not itself countably infinite.
A DRV may draw its values from a set of discrete data, i.e. it will always get its values from a finite set.

To understand this, consider what sort of process generates discrete random data.
Also see:
http://www.stat.yale.edu/Courses/1997-98/101/ranvar.htm
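
As an illustrative sketch of the kind of process meant above (my own example, not from the linked page): take the mean of two fair die rolls. Each outcome is a rational number, yet only 11 distinct values are possible, so the data are discrete and drawn from a finite set.

import random
from collections import Counter

# Mean of two fair die rolls: every outcome is a rational number
# (1.0, 1.5, 2.0, ..., 6.0), but the set of possible values is finite.
def mean_of_two_dice():
    return (random.randint(1, 6) + random.randint(1, 6)) / 2

samples = [mean_of_two_dice() for _ in range(10_000)]
print(sorted(Counter(samples)))  # at most 11 distinct rational values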
 


Simon Bridge said:
A DRV must take only a finite number of values. You need to check the difference between a DRV and discrete data.

Notice: the set of rational numbers would be continuous rather than discrete, but you can have discrete data that takes rational-number values.
However, you will never have an infinite number of those data points. The data set _takes its values_ from a countably infinite set; it is not itself countably infinite.
A DRV may draw its values from a set of discrete data, i.e. it will always get its values from a finite set.

To understand this, consider what sort of process generates discrete random data.
Also see:
http://www.stat.yale.edu/Courses/1997-98/101/ranvar.htm
I believe an RV taking only rational values would be discrete. I see the following quoted from Valerie J. Easton and John H. McColl's Statistics Glossary v1.1:
"A continuous random variable is not defined at specific values. Instead, it is defined over an interval of values."
An RV with countably many possible values must have defined probabilities at those values.
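
To make that concrete (an illustrative construction of my own, not taken from the glossary): let q_1, q_2, q_3, ... be an enumeration of the rationals in [0,1] and define
$$P(X = q_n) = 2^{-n}, \qquad n = 1, 2, 3, \dots$$
Then
$$\sum_{n=1}^{\infty} P(X = q_n) = \sum_{n=1}^{\infty} 2^{-n} = 1,$$
so X is a perfectly well-defined random variable whose possible values are countably infinite and dense in a finite interval, with a probability assigned to every one of them.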

I don't see the relevance of your discussion of data sets to the OP.
 


Sure, an RV taking rational values can be discrete.
The OP brought up data sets in the original question (with the reference to Gooch), though not in so many words. I'm not terribly happy with my attempt at clarifying how Gooch and Hogg-n-Craig are not in conflict.

I like:
"An RV with countably many possible values must have defined probabilities at those values."
I had wondered if I should have included something like that, perhaps as a question:
... if an RV could take any rational-number value in [0,1], then what would be the probability of getting 0.5?

I suppose a better way to think of a DRV is that a probability can be assigned to each particular outcome.
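
As a side note (a contrast of my own, not from the exchange above): for a continuous random variable with density f, any single point carries zero probability,
$$P(X = 0.5) = \int_{0.5}^{0.5} f(x)\,dx = 0,$$
whereas a discrete random variable can put strictly positive mass on 0.5; for instance, the enumeration construction above assigns probability 2^{-n} to whichever q_n equals 0.5.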
 


An excerpt from:
"pediaview.com/Probability_distributions"

... Equivalently to the above, a discrete random variable can be defined as a random variable whose cumulative distribution function (cdf) increases only by jump discontinuities—that is, its cdf increases only where it "jumps" to a higher value, and is constant between those jumps. The points where jumps occur are precisely the values which the random variable may take. The number of such jumps may be finite or countably infinite. The set of locations of such jumps need not be topologically discrete; for example, the cdf might jump at each rational number.

This means that the definition of a DRV is not at all obvious!
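
A concrete illustration of that CDF-based definition (added for clarity, not part of the excerpt): take X with
$$P\!\left(X = \tfrac{1}{n}\right) = 2^{-n}, \qquad n = 1, 2, 3, \dots$$
Its CDF
$$F(x) = \sum_{n \,:\, 1/n \le x} 2^{-n}$$
is constant except for a jump of size 2^{-n} at each point 1/n. By the jump-discontinuity definition X is discrete, yet it takes countably infinitely many values in the finite interval (0, 1], so it does not satisfy Hogg and Craig's requirement of at most finitely many values in every finite interval.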
 
