## Examples of Measures in context of probability

Hi all,
Can someone please give me a few examples of a measure in the context of probability? For example, my text says:
A probability measure P over a discrete set of events is basically what is known as a probability mass function. For example, given a probability measure P and two sets A, B ∈ ℬ, we can familiarly write
P(B|A) = P(A ∩ B)/P(A) .
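As a quick sanity check of that identity, here is a fair-die computation (the die, and the sets A and B, are made-up examples, not from any text):

```python
from fractions import Fraction

# Sample space: one roll of a fair die; the measure of a set of
# outcomes is (number of outcomes) / 6.
omega = {1, 2, 3, 4, 5, 6}

def P(s):
    return Fraction(len(s), len(omega))

A = {2, 4, 6}   # "roll is even"
B = {4, 5, 6}   # "roll is at least 4"

# Conditional probability: P(B|A) = P(A ∩ B) / P(A)
print(P(A & B) / P(A))  # 2/3
```

Here A ∩ B = {4, 6}, so P(A ∩ B) = 1/3 and P(A) = 1/2, giving 2/3.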

Is the probability mass function the only measure that we deal with in measure theory (in the context of probability), or do there exist more measures in probability?
Recognitions: Science Advisor

In probability theory continuous distributions are also studied. The subject involves anything concerning a measure where the total measure is one.
Woundedtiger, I don't know your background in measure theory, but essentially, modern probability is wholly measure-oriented (thanks to Kolmogorov). One exploits counting measures (e.g. probabilities of discrete random variables, pmfs, ...) and Lebesgue measure (probabilities of continuous random variables). Radon measures also represent probabilities under some special requirements. If you know the Radon-Nikodym theorem, then probability density (or mass) functions arise as Radon-Nikodym derivatives of distributions w.r.t. dominating measures (usually Lebesgue or counting measure).

Recognitions: Gold Member, Staff Emeritus


Every "probability mass function" (the anti-derivative of the probabilty density function) can be interpreted as a measure and (after "normalizing" to get the requirement that the total probability be 1) every measure can be interpreted as a probability mass function.

 Quote by HallsofIvy Every "probability mass function" (the anti-derivative of the probability density function) can be interpreted as a measure and (after "normalizing" to get the requirement that the total probability be 1) every measure can be interpreted as a probability mass function.
Shouldn't that be 'cumulative distribution function' (or 'distribution') if we're talking about the antiderivative of the pdf? The pmf is (in a weak sense) a discrete counterpart of the pdf, and, thinking of the pdf in the Radon-Nikodym sense w.r.t. a dominating measure, the pdf and pmf coincide, I think.
Recognitions: Gold Member, Science Advisor, Staff Emeritus

My understanding was that what woundedtiger4 was calling the "probability mass function" was what I (and you, apparently) would call the "cumulative distribution function".
 Well, I would expect that the conditional probability which woundedtiger4 expresses by his P(B|A) = ... is in terms of pdfs (pmfs) in their proper meaning, not cdfs. However, the OP is a bit unclear. Generally, I'm very wary of nomenclature abuse, and if this is the case, I'd suggest woundedtiger review what pdf/cdf/pmf mean.

 Quote by HallsofIvy Every "probability mass function" (the anti-derivative of the probability density function) can be interpreted as a measure and (after "normalizing" to get the requirement that the total probability be 1) every measure can be interpreted as a probability mass function.
Do you mean that the probability density function is an anti-derivative? I studied it as something to be integrated.

Please correct me if I am wrong.

 Quote by camillio Woundedtiger, I don't know your background in measure theory, but essentially, modern probability is wholly measure-oriented (thanks to Kolmogorov). One exploits counting measures (e.g. probabilities of discrete random variables, pmfs, ...) and Lebesgue measure (probabilities of continuous random variables). Radon measures also represent probabilities under some special requirements. If you know the Radon-Nikodym theorem, then probability density (or mass) functions arise as Radon-Nikodym derivatives of distributions w.r.t. dominating measures (usually Lebesgue or counting measure).
Hi, I am self-taught and have no formal guidance, except that I discuss concepts here on this great website or at openstudy. I am reading Kolmogorov's intro to real analysis at the moment and am only two chapters from finishing the book. I haven't studied the Radon-Nikodym theorem yet; hopefully I will reach that topic within a week. Actually, I am trying to study measure theory in the context of probability. I am also studying a book on probability by Henk Tijms, but I am quite slow in reading it and have been on the first chapter for the past four days. After reading about the PMF I found it quite interesting, and then I found some summary notes on measure theory on Google in which it was written that the PMF is a measure; that gave me a much better understanding of the Lebesgue integral (its domain, how it works, etc.). So I was wondering if someone could share more measures (of probability) so that I can strengthen my basic grasp of this extremely interesting subject.
 Hi woundedtiger4, I like your approach :-) Well, there are several measures in probability theory and related theories, e.g.:
- "Ordinary" probability, i.e. a measure defined on a probability space equipped with a sigma algebra of events.
- The Baire measure: one often speaks of a Baire measure if P is a regular finitely additive measure on the Borel sigma algebra B(E) of a topological space E. It is of interest, for instance, in stochastic processes, and it coincides with the Borel measure on metric spaces.
- The Wiener measure, the measure associated with Brownian motion (aka the Wiener process).

Some other measures were given in previous posts. Those listed in this post are often variations on the theme of a probability measure.
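As a toy illustration of the last item, one can sample a path of a Wiener process on a uniform grid: increments over a step of length dt are independent N(0, dt) draws. This is a minimal stdlib sketch; the function name `brownian_path` and its parameters are my own choices, not anything from the thread:

```python
import math
import random

# One sample path of a (discretized) Wiener process on [0, T]:
# each increment is an independent N(0, dt) draw.
def brownian_path(n_steps=1000, T=1.0, seed=0):
    rng = random.Random(seed)
    dt = T / n_steps
    w = [0.0]  # W(0) = 0 by definition
    for _ in range(n_steps):
        w.append(w[-1] + rng.gauss(0.0, math.sqrt(dt)))
    return w

path = brownian_path()
print(len(path), path[0])  # 1001 0.0
```

Wiener measure itself is the law of this process on path space; the simulation only draws from its finite-dimensional marginals.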

 Quote by woundedtiger4 Hi all, Can someone please give me a few examples of a measure in the context of probability? For example, my text says: A probability measure P over a discrete set of events is basically what is known as a probability mass function.
We can let the text slide since it says "is basically" instead of "is". Neither the probability density function nor the cumulative distribution function of a random variable is a probability measure. They can be used to compute probability measures, and perhaps it isn't much of an exaggeration to say that the probability density function "is basically" a measure.

Let U be the uniform distribution on the interval [0,1/2], so the probability density function of U is f(x) = 2. A probability measure is defined as a function that assigns a number between 0 and 1 to each set in a sigma algebra of sets. I won't try to get into the technicalities of a sigma algebra; let's just look at a particular set. Let s be the union of the intervals [0,1/8] and [3/8,1/2]. We can compute the probability that a realization of U falls in the set s by computing $\mu(s) = \int_0^{1/8} f(x)\, dx + \int_{3/8}^{1/2} f(x)\, dx$. This process assigns a number (a probability) to the set s.

For many other types of sets s, we can apply the same sort of process by integrating the probability density function over them. The "probability measure" is the function defined by this process. It is a function $\mu(s)$ whose argument $s$ is a set, not a number. The function defined by this process is more general than the cumulative distribution function: the cumulative distribution function F(x) only assigns probabilities to sets of the form $(-\infty, x]$. The process $\mu(s)$ involves using the probability density function, but it is not the same function as the probability density function.
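That computation can be sketched with exact fractions. The helper name `mu` and the interval representation are my own; since the density is the constant 2, each integral reduces to 2 times the interval's length:

```python
from fractions import Fraction

# Density of U ~ Uniform[0, 1/2] is f(x) = 2 on [0, 1/2].
# For a finite disjoint union of subintervals of [0, 1/2],
# the measure is the sum of 2 * (length of each interval).
def mu(intervals):
    return sum(2 * (b - a) for a, b in intervals)

# s = [0, 1/8] ∪ [3/8, 1/2]
s = [(Fraction(0), Fraction(1, 8)), (Fraction(3, 8), Fraction(1, 2))]
print(mu(s))  # 1/2
```

Each piece contributes 2 · (1/8) = 1/4, so μ(s) = 1/2.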

 For example, given a probability measure P and two sets A, B ∈ ℬ, we can familiarly write P(B|A) = P(A ∩ B)/P(A) .
I don't know what that example has to do with the original question. It's actually very hard to define conditional probabilities rigorously in the context of measure theory.

 Is the probability mass function the only measure that we deal with in measure theory (in the context of probability), or do there exist more measures in probability? thanks in advance.
There exist more general measures.

In elementary probability texts, you encounter two types of random variables. Those that take discrete values have "probability mass functions"; those that take a continuum of values have "probability density functions". Neither type of function "is" a probability measure, but both can be used to define probability measures.

In the case of a discrete random variable, you define the process $\mu(s)$ by summation instead of integration. For example, let X be the random variable with PMF given by f(0) = 1/3, f(1) = 2/3, and f(x) = 0 otherwise. Let s be the set [-1,1/2]. To compute $\mu(s)$, you add up all the non-zero values of f that occur in that interval (getting an answer of 1/3 in this example).
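That summation can be sketched as follows (the helper name `mu` is mine, and for simplicity it only handles sets that are intervals [lo, hi]):

```python
from fractions import Fraction

# PMF from the example: f(0) = 1/3, f(1) = 2/3, f(x) = 0 otherwise.
pmf = {0: Fraction(1, 3), 1: Fraction(2, 3)}

def mu(lo, hi):
    # Sum the pmf over the atoms that fall inside [lo, hi].
    return sum(p for x, p in pmf.items() if lo <= x <= hi)

# s = [-1, 1/2] contains the atom at 0 but not the one at 1.
print(mu(-1, Fraction(1, 2)))  # 1/3
```

Note that the whole line gets measure μ([0, 1]) = 1/3 + 2/3 = 1, as a probability measure must.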

 There are situations in elementary probability that involve random variables that are neither purely discrete nor purely continuous. For example, define the random variable X as follows: flip a fair coin; if the result is heads, then X = 1/2; if the result is tails, then set X equal to the realization of a uniform random variable on the interval [0,1]. If you want to know the probability that X falls in the interval s = [1/3,2/3], you can't get the right answer by doing the type of integral used in introductory calculus, and you can't get the right answer by a summation; you need a combination of both methods. A commonsense person would compute the answer as $\mu(s) = \frac{1}{2}(1) + \frac{1}{2} \int_{1/3}^{2/3} 1 \, dx = \frac{2}{3}$, since on heads X = 1/2 lands in s with certainty. If we grant that a commonsense person can find a procedure for every such set s in a sigma algebra of sets, then this method defines a probability measure.
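The combined computation can be sketched the same way: an atom part plus a density part. The helper `mu` and its interval-endpoint signature are my own, and it assumes the set is a subinterval of [0, 1]:

```python
from fractions import Fraction

# Mixed distribution from the example: with prob 1/2 the coin is heads
# and X = 1/2 exactly (an atom); with prob 1/2 it is tails and
# X ~ Uniform[0, 1] (density 1 on [0, 1]).
def mu(lo, hi):
    # Atom part: the point mass at 1/2 contributes iff 1/2 is in [lo, hi].
    atom = Fraction(1, 2) if lo <= Fraction(1, 2) <= hi else Fraction(0)
    # Continuous part: (1/2) * integral of the density 1 over [lo, hi].
    continuous = Fraction(1, 2) * (hi - lo)
    return atom + continuous

# s = [1/3, 2/3]: 1/2 + (1/2)(1/3) = 2/3
print(mu(Fraction(1, 3), Fraction(2, 3)))  # 2/3
```

Again μ([0, 1]) = 1/2 + 1/2 = 1, so the normalization checks out.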

 Quote by Stephen Tashi We can let the text slide since it says "is basically" instead of "is". Neither the probability density function nor the cumulative distribution function of a random variable is a probability measure. [...]
thanks a tonne for the very detailed explanation.