woundedtiger4 said:
Hi all,
Can someone please give me a few examples of a measure in the context of probability? For example, my text says:
A probability measure P over a discrete set of events is basically what we know as a probability mass function.
We can let the text slide by since it says "is basically" instead of "is". Neither the probability density function nor the cumulative distribution function of a random variable is a probability measure. They can be used to compute probability measures, and perhaps it isn't much of an exaggeration to say that the probability density function "is basically" a measure.
Let U be the uniform distribution on the interval [0,1/2], so the probability density function of U is f(x) = 2 on that interval (and 0 elsewhere). A probability measure is defined as a function that assigns a number between 0 and 1 to each set in a sigma algebra of sets. I won't try to get into the technicalities of a sigma algebra; let's just look at a particular set. Let s be the set consisting of the union of the intervals [0,1/8] and [3/8,1/2]. We can compute the probability that a realization of U will fall in the set s by computing

\mu(s) = \int_0^{1/8} f(x)\, dx + \int_{3/8}^{1/2} f(x)\, dx = \frac{1}{4} + \frac{1}{4} = \frac{1}{2}.

This process assigns a number (a probability) to the set s. For many other types of sets s, we can apply the same sort of process by integrating the probability density function over them. The "probability measure" is the function defined by this process. It is a function \mu(s) whose argument is a set, not a number.

The function defined by this process is more general than the cumulative distribution function. The cumulative distribution function F(x) only assigns probabilities to sets of the form (-\infty, x]. The process \mu(s) involves using the probability density function, but it is not the same function as the probability density function.
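If it helps to see the arithmetic, here is a rough numeric sketch in Python of the process just described. The names f and mu and the list of intervals are only my own illustration (using SciPy's quad for the integrals), not anything from the text.

Code:
# Numeric sketch: integrate the density f(x) = 2 of the uniform
# distribution on [0, 1/2] over a set s given as a finite union
# of disjoint intervals.
from scipy.integrate import quad

def f(x):
    return 2.0  # density of the uniform distribution on [0, 1/2]

def mu(intervals):
    """Measure of a finite union of disjoint intervals."""
    return sum(quad(f, a, b)[0] for (a, b) in intervals)

s = [(0, 1/8), (3/8, 1/2)]
print(mu(s))  # 0.25 + 0.25 = 0.5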
woundedtiger4 said:
For example, given a probability measure P and two sets A, B ∈ ℬ, we can familiarly write
P(B|A) = P(A ∩ B)/P(A).
I don't know what that example has to do with the original question. It's actually very hard to define conditional probabilities in the context of measure theory.
woundedtiger4 said:
Is the probability mass function the only measure that we deal with in measure theory (in the context of probability), or do there exist more measures in probability?
thanks in advance.
There exist more general measures.
In elementary probability texts, you encounter two types of random variables. Those that take discrete values have "probability mass functions". Those that take a continuum of values have "probability density functions". Neither type of function "is" a probability measure, but both types can be used to define probability measures.
In the case of a discrete random variable, you define the process \mu(s) by summation instead of integration. For example, let X be the random variable with PMF given by f(0) = 1/3, f(1) = 2/3, and f(x) = 0 otherwise. Let s be the set [-1,1/2]. To compute \mu(s) you add up all the non-zero values of f that occur in that interval (getting an answer of 1/3 in this example).
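A quick sketch of that summation, again with names of my own choosing:

Code:
# Discrete case: mu of an interval sums the PMF values whose
# support points land in that interval.
pmf = {0: 1/3, 1: 2/3}  # f(0) = 1/3, f(1) = 2/3, f(x) = 0 otherwise

def mu(a, b):
    """Measure of the interval [a, b] under the PMF above."""
    return sum(p for x, p in pmf.items() if a <= x <= b)

print(mu(-1, 1/2))  # only x = 0 lies in [-1, 1/2], so this gives 1/3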
There are situations in elementary probability that involve random variables that are not purely discrete and not purely continuous. For example, define the random variable X as follows. Flip a fair coin. If the result is heads, then X = 1/2. If the result is tails, then set X equal to the realization of a uniform random variable on the interval [0,1]. If you want to know the probability that X falls in the interval s = [1/3,2/3], you can't get the right answer by doing the type of integral used in introductory calculus, and you can't get the right answer by a summation. You need a combination of both methods. A commonsense person would compute the answer as

\mu(s) = \frac{1}{2}(1) + \frac{1}{2} \int_{1/3}^{2/3} 1\, dx = \frac{1}{2} + \frac{1}{6} = \frac{2}{3},

since the point mass at 1/2 falls in s for certain when the coin comes up heads. If we grant that a commonsense person can find a procedure for every such set s in a sigma algebra of sets, then his method defines a probability measure.
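One last sketch of that mixed case, with hypothetical names, showing the "summation plus integration" combination for intervals [a, b]:

Code:
# Mixed case: half a point mass at 1/2 plus half a uniform density on
# [0, 1]; mu combines an atom ("summation") part with an absolutely
# continuous ("integration") part.
def mu(a, b):
    atom_part = 1.0 if a <= 0.5 <= b else 0.0              # is the atom at 1/2 inside [a, b]?
    continuous_part = max(0.0, min(b, 1.0) - max(a, 0.0))  # length of [a, b] intersected with [0, 1]
    return 0.5 * atom_part + 0.5 * continuous_part

print(mu(1/3, 2/3))  # 0.5 * 1 + 0.5 * (1/3) = 2/3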