# Independence of sigma algebras

• Arthur84
In summary, the conversation discusses the independence of random variables and sigma algebras, with a specific focus on the example of Bernoulli random variables. The main theorem relies on a lemma, which states that if a set of independent pi-systems is given, then their generated sigma algebras are also independent. However, the lemma's hypothesis of full independence is stronger than pairwise independence, so it cannot be applied to merely pairwise independent systems. The final part of the conversation sketches how to build a counterexample: if X_1 were independent of the pair (X_2, X_3), then X_1, X_2, and X_3 would have to be mutually independent, which pairwise independence does not guarantee.

#### Arthur84

I am trying to establish whether the following is true (my intuition tells me it is), more importantly if it is true, I need to establish a proof.

If $X_1, X_2$ and $X_3$ are pairwise independent random variables and $Y=X_2+X_3$, is $X_1$ independent of $Y$? (In a specific example where the $X_i$'s are Bernoulli random variables the answer came out yes; in the general case I have no idea how to prove it.)

A related problem is:

If $G_1, G_2$ and $G_3$ are pairwise independent sigma algebras, is $G_1$ independent of the sigma algebra generated by $G_2$ and $G_3$ (which contains all the sets of both, but also additional sets such as the intersection of a set from $G_2$ with a set from $G_3$)?

This came about as I tried to solve the following:
Suppose a Brownian motion $\{W_t\}$ is adapted to a filtration $\{F_s\}$. If $0<s<t_1<t_2<t_3<\infty$, show that $a_1(W_{t_2}-W_{t_1})+a_2(W_{t_3}-W_{t_2})$ is independent of $F_s$, where $a_1,a_2$ are constants.

By definition, individual future increments are independent of $F_s$; for the life of me I don't know how to prove that a linear combination of future increments is independent of $F_s$. Intuitively, of course, it makes sense...

Any help is greatly appreciated.

For your Brownian motion question, what you say is indeed true (there the increments are fully independent, not merely pairwise). A full discussion can be found in Billingsley's "Probability and Measure" on page 50.

The proof relies on a lemma, which states that

Lemma: If $\mathcal{A}_1,...,\mathcal{A}_n$ are independent $\pi$-systems (=stable under finite intersections), then $\sigma(\mathcal{A}_1),...,\sigma(\mathcal{A}_n)$ are independent.

The proof is as follows:

Let $\mathcal{B}_i=\mathcal{A}_i\cup \{\Omega\}$; these are still independent $\pi$-systems.
Fix $B_2,\dots,B_n$ in $\mathcal{B}_2,\dots,\mathcal{B}_n$ respectively, and denote

$$\mathcal{L}=\Big\{L\in \mathcal{F}~\Big\vert~P\Big(L\cap \bigcap_{i=2}^n B_i\Big)=P(L)\prod_{i=2}^n P(B_i)\Big\}$$

This is a $\lambda$-system that contains the $\pi$-system $\mathcal{A}_1$, so by Dynkin's $\pi$-$\lambda$ theorem it contains $\sigma(\mathcal{A}_1)$. Since the $B_i$ were arbitrary (and may be taken equal to $\Omega$), this shows that $\sigma(\mathcal{A}_1)$ is independent of $\mathcal{A}_2,\dots,\mathcal{A}_n$. Now we can proceed by induction.

Now we can prove the main theorem:

Let $\mathcal{A}_i, i\in I$, be independent $\pi$-systems. If $I=\bigcup_{j\in J} I_j$ is a disjoint union, then the sigma algebras $\sigma\big(\bigcup_{i\in I_j} \mathcal{A}_i\big), j\in J$, are independent.

Proof: we put

$$\mathcal{C}_j=\{C~\vert~\exists K\subseteq I_j~\text{finite}, B_k\in \mathcal{A}_k, k\in K: C=\bigcap_{k\in K}{B_k}\}$$

Then $\mathcal{C}_j, j\in J$, are independent $\pi$-systems, and $\sigma(\mathcal{C}_j)=\sigma\big(\bigcup_{i\in I_j}\mathcal{A}_i\big)$. Now apply the lemma.
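The theorem is exactly what the original question needs in the mutually independent case: taking $I=\{1,2,3\}$, $I_1=\{1\}$, $I_2=\{2,3\}$, it yields that $\sigma(X_1)$ is independent of $\sigma(X_2, X_3)$, and $X_2+X_3$ is measurable with respect to the latter. A quick exhaustive check of the conclusion for three mutually independent fair Bernoulli variables (a sketch; the helper names are my own):

```python
from itertools import product
from fractions import Fraction

# Three mutually independent fair Bernoulli variables: each outcome
# (x1, x2, x3) in {0,1}^3 has probability 1/8.
outcomes = {w: Fraction(1, 8) for w in product((0, 1), repeat=3)}

def prob(event):
    """P(event) for a predicate on outcomes (x1, x2, x3)."""
    return sum(p for w, p in outcomes.items() if event(w))

# Check P(X1 = a, X2 + X3 = b) = P(X1 = a) P(X2 + X3 = b) for all a, b.
independent = all(
    prob(lambda w: w[0] == a and w[1] + w[2] == b)
    == prob(lambda w: w[0] == a) * prob(lambda w: w[1] + w[2] == b)
    for a in (0, 1)
    for b in (0, 1, 2)
)
print(independent)  # True: under mutual independence, X1 is independent of X2 + X3
```

Under mutual independence the factorization holds for every value pair, as the theorem predicts.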

I was wondering: the condition of the lemma, that $\mathcal{A}_1, \mathcal{A}_2,\dots,\mathcal{A}_n$ are independent $\pi$-systems, is too strong (stronger than pairwise independence), and I'm not sure how to apply it in the case of pairwise independent systems. In any case I will read the sections in Billingsley's book carefully.

Btw is there a quick explanation for:

A Brownian motion $\{W_t\}$ adapted to filtration $\{F_s\}$, if $0<s<t_1<t_2<t_3<\infty$, then $a_1(W_{t_2}-W_{t_1})+a_2(W_{t_3}-W_{t_2})$ is independent of $F_s$ where $a_1,a_2$ are constants.

Many thanks.

Pretty sure it isn't true. Suppose that instead of $Y = X_2 + X_3$ we had $Y = (X_2, X_3)$. If $X_1$ is independent of $Y$, then

$$P(X_1=x_1,\,X_2=x_2,\,X_3=x_3) = P(X_1=x_1,\, Y=(x_2,x_3)) = P(X_1=x_1)\,P(X_2=x_2,\,X_3=x_3) = P(X_1=x_1)\,P(X_2=x_2)\,P(X_3=x_3)$$

(the last step using the pairwise independence of $X_2$ and $X_3$), thus implying that $X_1$, $X_2$, and $X_3$ are mutually independent, not just pairwise. This isn't necessarily true.

So to disprove it, find three RVs that are pairwise independent but not mutually independent, and such that the sums $x_2+x_3$ over all possible value pairs are all distinct (so that $X_2 + X_3$ determines $(X_2, X_3)$).
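Following that recipe, one can take $X_1, X_2$ i.i.d. fair Bernoulli and $X_3 = X_1 \oplus X_2$ (XOR): the three are pairwise independent but not mutually independent. Here the sum $X_2+X_3$ does not fully determine the pair, but an exhaustive check already shows directly that $X_1$ fails to be independent of $Y = X_2+X_3$. A sketch (all names are my own):

```python
from itertools import product
from fractions import Fraction

# X1, X2 i.i.d. fair Bernoulli; X3 = X1 XOR X2.  The four equally
# likely outcomes (x1, x2) determine x3.
outcomes = {
    (x1, x2, x1 ^ x2): Fraction(1, 4) for x1, x2 in product((0, 1), repeat=2)
}

def prob(event):
    """P(event) for a predicate on outcomes (x1, x2, x3)."""
    return sum(p for w, p in outcomes.items() if event(w))

# Pairwise independence holds for every pair (Xi, Xj):
pairwise = all(
    prob(lambda w: w[i] == a and w[j] == b)
    == prob(lambda w: w[i] == a) * prob(lambda w: w[j] == b)
    for i, j in ((0, 1), (0, 2), (1, 2))
    for a in (0, 1)
    for b in (0, 1)
)
print(pairwise)  # True

# ...but X1 is not independent of Y = X2 + X3: if X1 = 1 then
# X3 = 1 - X2, so Y = 1 with probability 1, while P(Y = 1) = 1/2.
lhs = prob(lambda w: w[0] == 1 and w[1] + w[2] == 1)
rhs = prob(lambda w: w[0] == 1) * prob(lambda w: w[1] + w[2] == 1)
print(lhs == rhs)  # False: 1/2 vs 1/4
```

So pairwise independence alone does not survive taking the sum $X_2+X_3$.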

The independence of sigma algebras is the right framework for the relationship between $X_1$ and $Y$, where $Y=X_2+X_3$: since $Y$ is measurable with respect to $\sigma(X_2, X_3)$, $X_1$ is independent of $Y$ whenever $\sigma(X_1)$ is independent of $\sigma(X_2, X_3)$.

The catch, as the counterexample argument above shows, is that pairwise independence is not enough. Writing $P(X_1=a,\,X_2+X_3=b)$ as a sum of $P(X_1=a,\,X_2=x_2,\,X_3=x_3)$ over the pairs with $x_2+x_3=b$, one would need each three-way probability to factor as $P(X_1=a)\,P(X_2=x_2)\,P(X_3=x_3)$, and that factorization is exactly mutual independence, which pairwise independence does not imply.

If, on the other hand, $X_1, X_2, X_3$ are mutually independent, then the theorem above (applied with $I_1=\{1\}$ and $I_2=\{2,3\}$) gives that $\sigma(X_1)$ is independent of $\sigma(X_2, X_3)$, and therefore $X_1$ is independent of $Y$.

Moving on to the related problem, the same issue appears. For $G_1$ to be independent of the sigma algebra generated by $G_2$ and $G_3$, we need $P(A\cap B)=P(A)P(B)$ for every $A\in G_1$ and $B\in \sigma(G_2, G_3)$.

By the lemma it suffices to check this on the $\pi$-system of intersections $B_1\cap B_2$ with $B_1\in G_2$ and $B_2\in G_3$, i.e. $P(A\cap B_1\cap B_2)=P(A)\,P(B_1\cap B_2)$. But this is a genuinely three-way condition: pairwise independence of $G_1, G_2, G_3$ only controls intersections of two sets at a time, so it does not deliver it.

Finally, the problem involving Brownian motion does go through, because there we have more than pairwise independence: by definition a Brownian motion has independent increments, so $W_{t_2}-W_{t_1}$ and $W_{t_3}-W_{t_2}$ are jointly independent of $F_s$ for $0<s<t_1<t_2<t_3$. Hence the sigma algebra $\sigma(W_{t_2}-W_{t_1},\, W_{t_3}-W_{t_2})$ is independent of $F_s$, and since $a_1(W_{t_2}-W_{t_1})+a_2(W_{t_3}-W_{t_2})$ is measurable with respect to that sigma algebra, it is independent of $F_s$.
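This can be sanity-checked numerically (a simulation sketch, not a proof; all variable names and parameter values are my own choices). Since $(W_s, Z)$ with $Z=a_1(W_{t_2}-W_{t_1})+a_2(W_{t_3}-W_{t_2})$ is jointly Gaussian, independence of $Z$ from $W_s$ is equivalent to zero covariance, and a Monte Carlo estimate of that covariance should come out near zero:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
s, t1, t2, t3 = 0.5, 1.0, 1.5, 2.5   # arbitrary times 0 < s < t1 < t2 < t3
a1, a2 = 2.0, -3.0                    # arbitrary constants

# Build Brownian values at s, t1, t2, t3 from independent Gaussian increments.
w_s = rng.normal(0.0, np.sqrt(s), n)
w_t1 = w_s + rng.normal(0.0, np.sqrt(t1 - s), n)
w_t2 = w_t1 + rng.normal(0.0, np.sqrt(t2 - t1), n)
w_t3 = w_t2 + rng.normal(0.0, np.sqrt(t3 - t2), n)

z = a1 * (w_t2 - w_t1) + a2 * (w_t3 - w_t2)

# (W_s, Z) is jointly Gaussian, so zero covariance is equivalent to
# independence; the sample covariance should be close to 0.
print(np.cov(w_s, z)[0, 1])
```

Checking against $W_s$ alone is of course only a partial check of independence from all of $F_s$; the measure-theoretic argument above is what actually proves it.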

## 1. What is the concept of independence of sigma algebras?

The independence of sigma algebras refers to a relationship between two or more sigma algebras, which are collections of events (subsets of the outcome space, closed under complements and countable unions). If two sigma algebras are independent, the occurrence of events in one sigma algebra does not affect the likelihood of events in the other.

## 2. How is the independence of sigma algebras determined?

Two sigma algebras are independent if $P(A\cap B)=P(A)P(B)$ for every event $A$ in the first and every event $B$ in the second. In practice one rarely checks all pairs directly: by the $\pi$-$\lambda$ lemma it suffices to verify the product formula on $\pi$-systems (families stable under finite intersection) that generate the two sigma algebras.
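On a finite probability space the check can be made completely explicit: the sigma algebra generated by a single event $A$ is $\{\emptyset, A, A^c, \Omega\}$, and independence means the product formula holds for every pair of sets from the two sigma algebras. A small sketch (all names are my own):

```python
from fractions import Fraction

# Finite probability space: four equally likely outcomes.
omega = frozenset({1, 2, 3, 4})
P = {w: Fraction(1, 4) for w in omega}

def prob(S):
    """P(S) for a subset S of omega."""
    return sum(P[w] for w in S)

def sigma(A):
    """Sigma algebra generated by one event A: {empty, A, A^c, Omega}."""
    return {frozenset(), frozenset(A), omega - frozenset(A), omega}

def independent_sigma(F, G):
    """Check P(S & T) == P(S) P(T) for all S in F, T in G."""
    return all(prob(S & T) == prob(S) * prob(T) for S in F for T in G)

A, B = {1, 2}, {1, 3}  # independent events: P(A & B) = 1/4 = P(A) P(B)
print(independent_sigma(sigma(A), sigma(B)))    # True
print(independent_sigma(sigma(A), sigma({1})))  # False: 1/4 != 1/2 * 1/4
```

This also illustrates the point in question 5 below: independence of the events transfers to the generated sigma algebras and back.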

## 3. What is the significance of independence of sigma algebras in probability theory?

The independence of sigma algebras is a fundamental concept in probability theory, as it allows for the calculation of joint probabilities and conditional probabilities. It also allows for the use of certain mathematical tools, such as the law of total probability and Bayes' theorem.

## 4. Can two sigma algebras be both independent and dependent at the same time?

No, two sigma algebras cannot be both independent and dependent at the same time; they are either independent or dependent, with no in-between. If two sigma algebras are not independent, then they are dependent.

## 5. How does the independence of sigma algebras relate to the concept of independence of events?

The independence of sigma algebras is a generalization of the independence of events. Two events $A$ and $B$ are independent if and only if the sigma algebras they generate, $\sigma(A)=\{\emptyset, A, A^c, \Omega\}$ and $\sigma(B)$, are independent. More generally, sigma algebras can encode the information carried by whole families of events or random variables, so independence of sigma algebras covers situations that independence of individual events does not.