Weird statement in my book about (measure theoretic) conditional expectation

AxiomOfChoice
My book tries to illustrate the conditional expectation for a random variable X(\omega) on a probability space (\Omega,\mathscr F,P) by asking me to consider the sigma-algebra \mathscr G = \{ \emptyset, \Omega \}, \mathscr G \subset \mathscr F. It then argues that E[X|\mathscr G] = E[X] (I'm fine with that). But it claims this should make sense, since \mathscr G "gives us no information." How is this supposed to make sense? In what regard does the sigma-algebra \mathscr G give us "no information" about X? I mean, if you know the values X takes on \mathscr G, you know X(\omega) everywhere, right?! So this obviously is the wrong interpretation (in fact, any sigma-algebra necessarily contains \Omega, so this interpretation would make conditional expectation useless) but I can't think of what the right one is...
 
Think of a sigma-algebra as "containing information". Since \mathscr G is the trivial sigma-algebra, it carries no intrinsic information and so does not affect the expectation.
I must admit that this terminology is vague and nearly metaphorical. It's perfectly fine to set it aside if it doesn't suit your intuition.
 
I don't know how the book you're following sets it out.

But consider discrete random variables X, Z and E(X|Z=z) for distinct values z, and look at how the sigma-algebra generated by Z partitions \Omega. Consider first the functions measurable with respect to the trivial sigma-algebra, then a richer sigma-algebra, and you might get more of a feel for the idea of "information" in a sigma-algebra.

Even defining your own random variables, \Omega, etc. and doing the calculations may make the idea clearer to you.
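To make the suggestion above concrete, here is a small example of my own construction (not from any book): \Omega = \{0,\dots,5\} with the uniform measure, X(\omega)=\omega and Z(\omega)=\omega \bmod 2. The sigma-algebra \sigma(Z) is generated by the partition into evens and odds, and E[X|\sigma(Z)] is just the average of X over each block; the trivial sigma-algebra corresponds to the one-block partition and gives the constant E[X].

```python
# Omega = {0,...,5}, uniform, X(w) = w, Z(w) = w % 2.
# sigma(Z) is generated by the partition {evens, odds}; E[X | sigma(Z)]
# is constant on each block, equal to the average of X over that block.
from fractions import Fraction

omega = range(6)
P = {w: Fraction(1, 6) for w in omega}          # uniform measure
X = lambda w: Fraction(w)
Z = lambda w: w % 2

def cond_exp(X, blocks, P):
    """E[X | G] where G is generated by the given partition of Omega."""
    out = {}
    for block in blocks:
        mass = sum(P[w] for w in block)
        avg = sum(X(w) * P[w] for w in block) / mass
        for w in block:
            out[w] = avg                         # constant on each block
    return out

# partition induced by Z: {0,2,4} and {1,3,5}
blocks = [[w for w in omega if Z(w) == z] for z in sorted({Z(w) for w in omega})]
E_X_given_Z = cond_exp(X, blocks, P)
print(E_X_given_Z)       # 2 on the evens, 3 on the odds

# the trivial sigma-algebra {emptyset, Omega} = the one-block partition:
E_X_trivial = cond_exp(X, [list(omega)], P)
print(E_X_trivial)       # constant 5/2 = E[X] everywhere
```

The second call shows exactly the book's claim: conditioning on the trivial sigma-algebra returns the constant E[X].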
 
But what information is hidden if G is the trivial sigma algebra?
 
A lot? Potentially none - X might be G measurable.
 
What E[X\vert \mathcal{G}] gives you is a prediction of X based only on the information contained in \mathcal{G}.

So the clue is that E[X\vert \mathcal{G}] is \mathcal{G}-measurable. In fact, it is the \mathcal{G}-measurable random variable that approximates X best (and this can be made rigorous: it is the best approximation in the least-squares sense).

So E[X\vert\mathcal{G}] is an approximation of X that is \mathcal{G}-measurable. So for any ]a,b[, we know that

\{E[X\vert\mathcal{G}]\in\, ]a,b[\,\}\in \mathcal{G}

What happens if we have \mathcal{G}=\{\emptyset,\Omega\}? Then we know that

\{E[X\vert\mathcal{G}]\in\, ]a,b[\,\}\in \{\emptyset,\Omega\}

But this places severe restrictions on E[X\vert\mathcal{G}]. In fact, it forces this random variable to be constant!

If we take \mathcal{G} to be finer (thus to contain more sets), then we allow E[X\vert \mathcal{G}] to take on more values. Specifically, we allow it to approximate X better.

For example, if \mathcal{G}=\{\emptyset,\Omega, G,G^c\}, then we must have

\{E[X\vert\mathcal{G}]\in\, ]a,b[\,\}\in \{\emptyset,\Omega,G,G^c\}

This does not force our random variable to be constant. Indeed, we now allow E[X\vert\mathcal{G}] to take different values on G and G^c. So our random variable can now be 2-valued!
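A minimal sketch of this 2-valued case, with an example of my own choosing: a fair die, \Omega = \{1,\dots,6\}, and G = "the roll is even". With \mathcal{G}=\{\emptyset,\Omega,G,G^c\}, the conditional expectation is just the average of X on G and the average of X on G^c.

```python
# Fair die: Omega = {1,...,6}, X(w) = w, G = {2,4,6} ("even").
# E[X | {emptyset, Omega, G, G^c}] takes at most two values:
# one on G, one on G^c.
from fractions import Fraction

omega = range(1, 7)
G  = [w for w in omega if w % 2 == 0]    # {2, 4, 6}
Gc = [w for w in omega if w % 2 == 1]    # {1, 3, 5}

def block_avg(block):
    # uniform measure, so the conditional expectation on a block
    # reduces to a plain average
    return sum(Fraction(w) for w in block) / len(block)

E_X_given_G = {w: (block_avg(G) if w in G else block_avg(Gc)) for w in omega}
print(sorted(set(E_X_given_G.values())))   # [3, 4]: 3 on G^c, 4 on G
```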

The finer we make \mathcal{G}, the more variable the E[X\vert \mathcal{G}] can be. And the better the approximation can be!
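The "finer means better approximation" claim can be checked numerically in the least-squares sense. A sketch under my own assumptions (\Omega = \{0,\dots,7\} uniform, X(\omega)=\omega): compare the mean squared error E[(X - E[X\vert\mathcal{G}])^2] for three nested partitions, from the trivial one to finer ones.

```python
# Three nested partitions of Omega = {0,...,7} (uniform), from coarse to
# fine; the mean squared error of E[X | G] should strictly decrease.
omega = list(range(8))
X = {w: float(w) for w in omega}
p = 1 / len(omega)                     # uniform probability of each point

def cond_exp(partition):
    out = {}
    for block in partition:
        avg = sum(X[w] for w in block) / len(block)
        for w in block:
            out[w] = avg
    return out

def mse(partition):
    e = cond_exp(partition)
    return sum((X[w] - e[w]) ** 2 * p for w in omega)

trivial  = [omega]                                       # {emptyset, Omega}
halves   = [omega[:4], omega[4:]]                        # generated by {G, G^c}
quarters = [omega[:2], omega[2:4], omega[4:6], omega[6:]]

errors = [mse(trivial), mse(halves), mse(quarters)]
print(errors)                          # [5.25, 1.25, 0.25]
assert errors[0] > errors[1] > errors[2]
```

The first number is just Var(X), since on the trivial sigma-algebra the best approximation is the constant E[X]; each refinement cuts the error.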

I hope this helped.
 