Is $X_1$ Independent of $Y = X_2 + X_3$ Given Pairwise Independence?

AI Thread Summary
The discussion centers on whether $X_1$ is independent of $Y = X_2 + X_3$ given that $X_1$, $X_2$, and $X_3$ are pairwise independent random variables. While simple examples may suggest that this independence holds, the final reply argues that it can fail when the variables are only pairwise (rather than fully) independent. A related inquiry involves the independence of sigma algebras generated by pairwise independent systems. The conversation also touches on the independence of linear combinations of future increments of Brownian motion relative to a filtration, with skepticism expressed about whether the lemma used in the proofs applies under mere pairwise independence. The conclusion emphasizes the need for careful attention to the definition of independence in these contexts.
Arthur84
I am trying to establish whether the following is true (my intuition tells me it is), more importantly if it is true, I need to establish a proof.

If $X_1$, $X_2$ and $X_3$ are pairwise independent random variables and $Y=X_2+X_3$, is $X_1$ independent of $Y$? (One can think of an example where the $X_i$'s are Bernoulli random variables, and there the answer is yes; in the general case I have no idea how to prove it.)

A related problem is:

If $G_1$, $G_2$ and $G_3$ are pairwise independent sigma algebras, is $G_1$ independent of the sigma algebra generated by $G_2$ and $G_3$ (which contains all the sets of both, but also additional sets such as the intersection of a set from $G_2$ with a set from $G_3$)?

This came about as I tried to solve the following:
Suppose a Brownian motion $\{W_t\}$ is adapted to a filtration $\{F_s\}$. If $0<s<t_1<t_2<t_3<\infty$, show that $a_1(W_{t_2}-W_{t_1})+a_2(W_{t_3}-W_{t_2})$ is independent of $F_s$, where $a_1,a_2$ are constants.

By definition, individual future increments are independent of $F_s$, but for the life of me I don't know how to prove that a linear combination of future increments is independent of $F_s$; intuitively, of course, it makes sense...
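As a quick numerical sanity check (a sketch only, not a proof, and with illustrative times $s=1, t_1=2, t_2=3, t_3=4$ and coefficients $a_1=2, a_2=-1$ picked arbitrarily), one can simulate the path at the relevant times and verify that the linear combination shows no dependence on $W_s$:

```python
import numpy as np

# Monte-Carlo sanity check (not a proof). Times and coefficients below
# are illustrative choices, not from the problem statement.
rng = np.random.default_rng(0)
n = 200_000
s, t1, t2, t3 = 1.0, 2.0, 3.0, 4.0
a1, a2 = 2.0, -1.0

# Simulate the Brownian path at s < t1 < t2 < t3 via independent Gaussian increments.
W_s = rng.normal(0.0, np.sqrt(s), n)
W_t1 = W_s + rng.normal(0.0, np.sqrt(t1 - s), n)
W_t2 = W_t1 + rng.normal(0.0, np.sqrt(t2 - t1), n)
W_t3 = W_t2 + rng.normal(0.0, np.sqrt(t3 - t2), n)

Z = a1 * (W_t2 - W_t1) + a2 * (W_t3 - W_t2)

# If Z is independent of F_s, it should show no dependence on W_s:
print("corr(W_s, Z)               :", np.corrcoef(W_s, Z)[0, 1])    # ~ 0
print("mean of Z | W_s > 0 vs all :", Z[W_s > 0].mean(), Z.mean())  # ~ equal
print("var  of Z | W_s > 0 vs all :", Z[W_s > 0].var(), Z.var())    # ~ equal
```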

Any help is greatly appreciated.
 
What you say is indeed true. A full discussion can be found in Billingsley's "Probability and Measure" on page 50.

The proof relies on the following lemma.

Lemma: If $\mathcal{A}_1,\dots,\mathcal{A}_n$ are independent $\pi$-systems (i.e., stable under finite intersections), then $\sigma(\mathcal{A}_1),\dots,\sigma(\mathcal{A}_n)$ are independent.

The proof is as follows:

Let $\mathcal{B}_i=\mathcal{A}_i\cup \{\Omega\}$; these are still independent $\pi$-systems.
Fix $B_2,\dots,B_n$ in $\mathcal{B}_2,\dots,\mathcal{B}_n$ respectively, and denote

$$\mathcal{L}=\Big\{L\in \mathcal{F}~\Big\vert~P\Big(L\cap \bigcap_{i=2}^n B_i\Big)=P(L)\prod_{i=2}^n P(B_i)\Big\}.$$

This is a $\lambda$-system that contains $\mathcal{A}_1$, so by the $\pi$-$\lambda$ theorem it contains $\sigma(\mathcal{A}_1)$. Since the $B_i$ were arbitrary (and $\Omega\in\mathcal{B}_i$ lets us drop any of them), $\sigma(\mathcal{A}_1)$ is independent of $\mathcal{A}_2,\dots,\mathcal{A}_n$. Now we can proceed by induction.

Now we can prove the main theorem:

If $\mathcal{A}_i$, $i\in I$, are independent $\pi$-systems and $I=\bigcup_{j\in J} I_j$ is a disjoint union, then the $\sigma$-algebras $\sigma(\bigcup_{i\in I_j} \mathcal{A}_i)$, $j\in J$, are independent.

Proof: we put

$$\mathcal{C}_j=\Big\{C~\Big\vert~\exists\, K\subseteq I_j \text{ finite},\ B_k\in \mathcal{A}_k \text{ for } k\in K:\ C=\bigcap_{k\in K} B_k\Big\}.$$

Then the $\mathcal{C}_j$, $j\in J$, are independent $\pi$-systems, and $\sigma(\mathcal{C}_j)=\sigma(\bigcup_{i\in I_j}\mathcal{A}_i)$. Now apply the lemma.
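
As an aside, here is a sketch (my own wording, not taken from Billingsley) of how this answers the Brownian motion question, assuming the standard definition that $W_t-W_u$ is independent of $F_u$ for every $u<t$. For $A\in F_s$ and Borel sets $B,C$,

$$P\big(A\cap\{W_{t_2}-W_{t_1}\in B\}\cap\{W_{t_3}-W_{t_2}\in C\}\big)
=P\big(A\cap\{W_{t_2}-W_{t_1}\in B\}\big)\,P\big(W_{t_3}-W_{t_2}\in C\big)
=P(A)\,P\big(W_{t_2}-W_{t_1}\in B\big)\,P\big(W_{t_3}-W_{t_2}\in C\big),$$

first because $A\cap\{W_{t_2}-W_{t_1}\in B\}\in F_{t_2}$ while $W_{t_3}-W_{t_2}$ is independent of $F_{t_2}$, and then because $A\in F_s\subseteq F_{t_1}$ while $W_{t_2}-W_{t_1}$ is independent of $F_{t_1}$. So $F_s$, $\sigma(W_{t_2}-W_{t_1})$ and $\sigma(W_{t_3}-W_{t_2})$ are fully (not merely pairwise) independent $\pi$-systems. The theorem above, with the last two grouped together, makes $F_s$ independent of $\sigma(W_{t_2}-W_{t_1},\,W_{t_3}-W_{t_2})$, and $a_1(W_{t_2}-W_{t_1})+a_2(W_{t_3}-W_{t_2})$ is measurable with respect to that $\sigma$-algebra, hence independent of $F_s$.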
 
Thank you for the reply!
I was wondering: the condition of the lemma that $\mathcal{A}_1,\mathcal{A}_2,\dots,\mathcal{A}_n$ are independent $\pi$-systems is too strong, since it is stronger than pairwise independence, and I'm not sure how to apply it in the case of pairwise independent systems. In any case I will read the relevant section in Billingsley's book carefully.

Btw is there a quick explanation for:

A Brownian motion $\{W_t\}$ is adapted to a filtration $\{F_s\}$; if $0<s<t_1<t_2<t_3<\infty$, then $a_1(W_{t_2}-W_{t_1})+a_2(W_{t_3}-W_{t_2})$ is independent of $F_s$, where $a_1,a_2$ are constants.

Many thanks.
 
Pretty sure it isn't true. Suppose instead of $Y = X_2 + X_3$ we have $Y = (X_2, X_3)$. If $X_1$ is independent of $Y$, then
$$P(X_1=x_1,X_2=x_2,X_3=x_3) = P(X_1=x_1, Y = (x_2,x_3)) = P(X_1=x_1)\, P(X_2=x_2, X_3=x_3) = P(X_1=x_1)\, P(X_2=x_2)\, P(X_3=x_3),$$
thus implying that $X_1$, $X_2$, and $X_3$ are all independent (not just pairwise). This isn't necessarily true.

So to disprove it, find three RVs that are pairwise independent but not all independent, and such that all the sums of possible values of $X_2$ and $X_3$ are distinct (so that knowing $X_2 + X_3$ is the same as knowing the pair $(X_2, X_3)$).
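
For a concrete instance of this recipe (a construction supplied here for illustration, not one given in the thread): take $X_1, X_2$ independent fair Bernoulli variables and $X_3 = 2\,(X_1 \oplus X_2)$, i.e., twice their XOR. The three variables are pairwise independent but not mutually independent, the four possible values of $X_2+X_3$ (namely 0, 1, 2, 3) identify the pair $(X_2, X_3)$, and a short enumeration confirms that $X_1$ is not independent of $Y = X_2 + X_3$:

```python
from itertools import product
from fractions import Fraction

# Counterexample sketch: X1, X2 independent fair Bernoulli, X3 = 2*(X1 XOR X2).
# Pairwise independent, not mutually independent, and the values of X2 + X3
# are in one-to-one correspondence with the pairs (X2, X3).
half = Fraction(1, 2)
outcomes = []  # list of (probability, x1, x2, x3)
for x1, x2 in product((0, 1), repeat=2):
    x3 = 2 * (x1 ^ x2)
    outcomes.append((half * half, x1, x2, x3))

def prob(event):
    return sum(p for (p, x1, x2, x3) in outcomes if event(x1, x2, x3))

# Check pairwise independence of (X1, X2), (X1, X3), (X2, X3).
for (i, j) in [(0, 1), (0, 2), (1, 2)]:
    for a, b in product((0, 1, 2), repeat=2):
        joint = prob(lambda *x: x[i] == a and x[j] == b)
        marg = prob(lambda *x: x[i] == a) * prob(lambda *x: x[j] == b)
        assert joint == marg
print("pairwise independence: OK")

# X1 is NOT independent of Y = X2 + X3.
p_y0 = prob(lambda x1, x2, x3: x2 + x3 == 0)
p_y0_given_x1_0 = (prob(lambda x1, x2, x3: x1 == 0 and x2 + x3 == 0)
                   / prob(lambda x1, x2, x3: x1 == 0))
print("P(Y=0)        =", p_y0)             # 1/4
print("P(Y=0 | X1=0) =", p_y0_given_x1_0)  # 1/2  -> not independent
```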
 