Measurability of random variables 
#1
Apr1512, 11:34 AM

P: 59

I've been working with random variables for a while and only today have I come up with a basic question that undermines what I thought I knew...

If I have two random variables X and Y, when am I allowed to multiply them, i.e. form Z = XY? Let S_1 and S_2 be sigma algebras such that S_1 is contained in S_2.

Cases:
i) X and Y are both S_1 measurable. It seems clear that Z = XY exists and is also S_1 measurable.
ii) X is S_1 measurable and Y is S_2 measurable. In this case X is also S_2 measurable, but Y is not necessarily S_1 measurable. (Am I correct to say this?) Can we form Z = XY, and if so does Z simply become S_2 measurable?
iii) X is S_1 measurable and Y is S_3 measurable, where S_3 is not a subset of either S_1 or S_2. Can we write Z = XY?

Thanks for your help
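Cases i) and ii) can be made concrete on a finite sample space. This is entirely my own toy construction (the sigma algebras S1 and S2 are hypothetical examples generated from partitions, not anything from the thread):

```python
from itertools import combinations

def sigma_from_partition(partition):
    """The sigma algebra generated by a partition of a finite set:
    all possible unions of its blocks."""
    blocks = [frozenset(b) for b in partition]
    algebra = set()
    for r in range(len(blocks) + 1):
        for combo in combinations(blocks, r):
            algebra.add(frozenset().union(*combo))
    return algebra

def is_measurable(f, omega, algebra):
    """For a finite-valued f, measurability reduces to checking that every
    preimage f^{-1}({v}) lies in the algebra (the algebra is closed under
    unions, so this suffices here)."""
    for v in set(f[w] for w in omega):
        preimage = frozenset(w for w in omega if f[w] == v)
        if preimage not in algebra:
            return False
    return True

omega = {1, 2, 3, 4}
S1 = sigma_from_partition([{1, 2}, {3, 4}])       # coarser
S2 = sigma_from_partition([{1}, {2}, {3, 4}])     # finer: S1 is contained in S2

X = {1: 5.0, 2: 5.0, 3: 7.0, 4: 7.0}   # constant on S1's blocks -> S1 measurable
Y = {1: 1.0, 2: 2.0, 3: 3.0, 4: 3.0}   # separates 1 from 2 -> only S2 measurable

Z = {w: X[w] * Y[w] for w in omega}     # the product Z = XY

print(is_measurable(X, omega, S1))  # True  (case i for X)
print(is_measurable(Y, omega, S1))  # False (Y is not S1 measurable)
print(is_measurable(Z, omega, S2))  # True  (case ii: Z is S2 measurable)
print(is_measurable(Z, omega, S1))  # False (Z is not S1 measurable)
```

Here Z is measurable with respect to the larger sigma algebra S2 but not the smaller S1, which matches the intuition behind case ii).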



#5
Apr1612, 03:21 PM

Sci Advisor
P: 6,077

In all my courses in probability theory I have never encountered a situation involving more than one random variable where they were not defined on the same probability space, with the same sigma algebra.



#6
Apr1712, 10:27 AM

Sci Advisor
P: 3,300

"Contained" is an ambiguous word in many contexts, but perhaps it is the word we should use here, since a sigma algebra is a collection of sets rather than a single set.

I think so, in spirit, but in addition to the technicality that you should be talking about measurability with respect to measures instead of sigma algebras, there is the technicality that X restricted to a smaller domain is not the same function as X. The function Y isn't measurable "on S_1" merely because it isn't defined there. This raises the interesting question of whether there is a unique extension of Y that is. I don't know the answer to that.

I think you can define a "product measure" on tuples of sets, each taken from a different sigma algebra. So if X is [itex] \mu [/itex] measurable on S_1 and Y is [itex] \mu_3 [/itex] measurable on S_3, then you can implement the idea of an independent realization of X and Y by taking (what should I say?) the Cartesian product of S_1 with S_3. That terminology may only apply to sets, but you get the idea. You can define a product measure [itex] \mu\mu_3 \ [/itex] on that collection of ordered pairs of sets.

If you are trying to deal with a situation where there is a dependence between X and Y, then you have to say more about what relates them before we can make progress.
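The product-measure idea in this post can be sketched on finite spaces. This is a toy illustration of my own (the spaces, point masses, and names mu and mu3 are hypothetical, chosen to echo the post's notation):

```python
# Two measures on two different finite sample spaces.
mu  = {'a': 0.3, 'b': 0.7}   # point masses on Omega1 = {a, b}
mu3 = {1: 0.5, 2: 0.5}       # point masses on Omega3 = {1, 2}

def product_measure(A, B):
    """The product measure on a measurable rectangle A x B:
    (mu x mu3)(A x B) = mu(A) * mu3(B).
    On a finite space, the measure of a set is the sum of its point masses."""
    return sum(mu[x] for x in A) * sum(mu3[y] for y in B)

# Independence of X and Y falls out of the rectangle formula:
# P(X in A, Y in B) = P(X in A) * P(Y in B).
print(product_measure({'a'}, {1, 2}))    # 0.3 * 1.0 = 0.3
print(product_measure({'a', 'b'}, {1}))  # 1.0 * 0.5 = 0.5
```

Extending from rectangles to the full product sigma algebra is exactly the step that requires the product-measure machinery (Carathéodory extension) in the general case.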



#8
Apr1712, 06:16 PM

Sci Advisor
P: 3,300

You can hardly state any statistical problem of moderate complexity without getting into several different probability spaces. For example, the probability space for 5 independent random draws from a normal distribution is not the same as the probability space for 1 random draw from a normal distribution.
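A minimal sketch of this point, using only Python's standard library (my own illustration): a single outcome of the 5-draw experiment is a point in R^5, not a point in R, so the two experiments live on different spaces.

```python
import random

# One draw from N(0,1) is an outcome in R, on the space (R, B(R), N(0,1)).
# Five independent draws form a single outcome in R^5, on the product space
# (R^5, B(R^5), N(0,1)^5).
random.seed(0)

one_draw = random.gauss(0.0, 1.0)                               # a point in R
five_draws = tuple(random.gauss(0.0, 1.0) for _ in range(5))    # a point in R^5

# A statistic such as the sample mean is a random variable on the
# 5-fold product space, not on the 1-draw space.
sample_mean = sum(five_draws) / len(five_draws)
print(len(five_draws))  # 5
```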


#9
Apr1712, 10:55 PM

P: 2,501

A probability space consists of the triple of 1) a sample space of outcomes, 2) a sigma algebra over an event space where zero or more outcomes are associated with each event, and 3) a probability measure assigned to each event. Outcomes can be complex: the sequence HHTHTTHHTH is an outcome. In fact, any set of events associated with any number of coin tosses will have a total probability which cannot exceed one in a single experiment. You can define multiple probability spaces to correspond to multiple experiments, but you can also partition a single probability space to correspond to a combined set of experiments when such aggregation makes sense.
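The triple described above can be written down explicitly for a small coin-toss experiment. A toy version of my own (three tosses rather than ten, to keep the space small; the sigma algebra is taken to be the full power set):

```python
from itertools import product
from fractions import Fraction

# 1) Sample space: all sequences of 3 coin tosses (8 outcomes).
outcomes = [''.join(t) for t in product('HT', repeat=3)]

# 3) Probability measure: point mass 1/8 on each outcome (fair coin).
P = {w: Fraction(1, 8) for w in outcomes}

def prob(event):
    """P(event), where an event is any set of outcomes.
    (2) Here the sigma algebra is the full power set, so every set of
    outcomes is an event."""
    return sum(P[w] for w in event)

exactly_two_heads = {w for w in outcomes if w.count('H') == 2}
print(prob(exactly_two_heads))   # 3/8
print(prob(set(outcomes)))       # 1: total probability cannot exceed one
```

Using `Fraction` keeps the arithmetic exact, so the axiom P(sample space) = 1 can be checked without floating-point slack.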


#10
Apr1712, 11:06 PM

P: 4,573

The only catch is if one variable has a nonzero probability of taking the value +infinity or -infinity while the other has a nonzero probability of being nonzero. This is the only case that wouldn't make sense, since usually we have to have something finite unless you specifically define characteristics that make sense of non-finite realizations of your random variables. Also, I'm assuming that X and Y are just real numbers when they are realized.

The probability space for Z will simply be the Cartesian product of the spaces for X and Y, in the sense that an event for Z pairs each realization of X with a realization of Y, in the same way that we generate [0,1]x[0,1] as a Cartesian product. Note that I am talking about event generation and not probability generation for the events: the latter will depend on the distribution itself and on things like whether there are any dependencies between X and Y.


#11
Apr1812, 12:05 AM

Sci Advisor
P: 3,300

You can form a set that is the product of 5 sets. You can form the product sigma algebra of 5 sigma algebras and you can form the product measure of 5 measures. (Most practical probability books don't treat the measure theory aspects of probability rigorously so they don't bother with such things.) 
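The 5-fold product measure mentioned here can be sketched concretely for a finite case. A toy illustration of my own (a Bernoulli coin measure, copied 5 times; the names are hypothetical): under the product measure, a single point of the product space gets the product of its coordinate probabilities.

```python
import math

# A measure on one coin toss.
p = 0.5
mu_point = {'H': p, 'T': 1 - p}

def product_measure_5(outcome):
    """Measure of a single point (w1, ..., w5) of the 5-fold product space
    under the product measure mu x mu x mu x mu x mu."""
    assert len(outcome) == 5
    return math.prod(mu_point[w] for w in outcome)

print(product_measure_5('HHTHT'))   # 0.5**5 = 0.03125
```

The product sigma algebra itself is larger than the set of such points, but on a finite space every event's measure is recovered by summing point masses like these.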


#12
Apr1812, 12:46 AM

Sci Advisor
P: 3,300

A "random variable" in measure theory is a function from some set to the real numbers. There is some sigma algebra you care about on the real numbers. The random variable has the property that if you take the inverse image of any set in that sigma algebra under the random variable, it will be a set in another sigma algebra on the domain of the random variable that you also care about, and that set will be measurable by the measure defined on that sigma algebra. So a "random variable" in measure theory can't be defined without reference to a measure (and the other things).

(It's ironic that I'm playing the role of measure theory person. It isn't the way I think about probability theory and I was never any good at measure theory. Maybe this is penance.)

So what plays the role of a "probability density function" in measure theory? I wish a real measure theory expert would tell us. Basically, a "measure" is a function that defines a type of abstract integration. A density (in ordinary probability theory) is the derivative of a particular kind of integral. So I think the analog of a probability density function would be a function that is, in some sense, a derivative of a measure. This is called a "Radon-Nikodym derivative".

I don't know whether the4thamigo_uk is interested in this or whether he wants to step back from the measure theory cliff.
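In the simplest (discrete) setting the Radon-Nikodym derivative is just a ratio of point masses, which can be shown in a few lines. A sketch of my own (the measures mu and nu are hypothetical examples, with nu absolutely continuous with respect to mu):

```python
# Discrete sketch of the Radon-Nikodym derivative: if nu << mu on a finite
# space, the "density" is d(nu)/d(mu)(x) = nu({x}) / mu({x}), and nu is
# recovered by integrating that density against mu.

mu = {0: 0.25, 1: 0.25, 2: 0.25, 3: 0.25}   # reference measure (uniform)
nu = {0: 0.10, 1: 0.40, 2: 0.30, 3: 0.20}   # target probability measure

# The Radon-Nikodym derivative, point by point.
density = {x: nu[x] / mu[x] for x in mu}

def nu_of(A):
    """nu(A) = integral over A of the density with respect to mu."""
    return sum(density[x] * mu[x] for x in A)

print(density[1])      # 1.6  (nu puts more mass at 1 than mu does)
print(nu_of({1, 2}))   # 0.7 = nu({1}) + nu({2})
```

This mirrors the ordinary calculus picture: the density is the "derivative" of nu with respect to mu, and integrating it gives nu back.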


#13
Apr1812, 01:56 AM

P: 4,573

Provided that you have the right measures and that the values for each 'distribution' are finite, wouldn't you still get a final 'distribution' that satisfies the axioms and produces finite results?



#14
Apr1812, 10:40 AM

Sci Advisor
P: 3,300

The way ordinary probability texts sidestep measure theory is to use specific methods of integration. They use Riemann (or similar) integrals for continuous random variates, and for discrete distributions they use summation. From the point of view of measure theory, both of these methods are the beginnings of measures.

It is easy to invent examples of random variates that aren't purely continuous or discrete. For example, define the random variable X (in practical terms) as follows. Flip a fair coin. If the coin lands heads then X = 1. If the coin lands tails then let X be the result of a draw from a uniform random variable u on the interval [0,2]. Practical people know how to handle the distribution of X through a mixture of Riemann integration and summation, but you can't write a simple exposition of a theory of distribution functions and densities that handles this type of situation unless you get into forms of integration and differentiation that are more general than Riemann integration and summation.

If we look at a simple definite integral from calculus [itex] \int_a^b f(x) dx [/itex], we can pretend f(x) is "given" and regard the definite integral as a function whose domain is the collection of sets of the form [a,b] and whose range is the real numbers. The reason it isn't a measure on the real numbers is that it doesn't produce an answer on all the sets in a sigma algebra on the real numbers. You have to struggle to extend the definition of the integral in order to get results on all the weird sets that can crop up in a sigma algebra.

My education went from Riemann integration to measure theory with only a brief stop at the Riemann-Stieltjes integral, but I think that type of integration is one way of handling a mixture of continuous and discrete random variates.

The outlook of measure theory is: "Let's assume I've solved all the integration theory. We aren't going to worry about how I did it, or whether there is any underlying function f(x) that I'm integrating over this collection of sets, or whether I'm using a mixture of Riemann integration and summation. We'll assume I have a measure, so if you give me a set in the sigma algebra then I can assign it a number, and the way this function behaves on the sets resembles the way that simple theories of integration behave on the sets they can deal with."

If you want to go from measure theory and probability measures to something resembling probability densities or cumulative distributions, you need more theoretical machinery. My point is that densities and distributions are not "built in" to the basics of measure theory. A measure is like a "black box" process. You can speculate that it comes from integrating a specific function by using a specific method of integration, but nothing in the definition of a measure guarantees that this is how it operates.
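The mixed random variable described in this post is easy to simulate directly, which makes its hybrid character visible: the distribution has an atom of mass 1/2 at x = 1 sitting on top of a continuous part. A quick sketch (my own simulation, using only the standard library):

```python
import random

# The mixed variate from the post: flip a fair coin; heads -> X = 1,
# tails -> X ~ Uniform[0, 2]. Its CDF is continuous except for a
# jump of size 1/2 at x = 1.
random.seed(42)

def draw_X():
    if random.random() < 0.5:          # heads
        return 1.0
    return random.uniform(0.0, 2.0)    # tails

samples = [draw_X() for _ in range(200_000)]
n = len(samples)

# Empirical checks:
# P(X = 1) ~ 1/2   (the discrete atom; the continuous part hits 1 with prob 0)
# P(X < 1) ~ 1/4   (= P(tails) * P(u < 1) = 1/2 * 1/2)
p_atom = sum(1 for x in samples if x == 1.0) / n
cdf_below = sum(1 for x in samples if x < 1.0) / n
print(p_atom, cdf_below)
```

Computing E[X] for this variate requires exactly the mix the post describes: a summation term for the atom plus a Riemann integral for the continuous part, which is what a Riemann-Stieltjes (or Lebesgue) integral handles in one stroke.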

