Proving P(X_4|X_1,X_2)=P(X_4|X_2) for Markov Chain with Finite Possibilities

  • Context: Undergrad
  • Thread starter: mathman
  • Tags: Markov chain
SUMMARY

The discussion focuses on proving the relationship P(X_4|X_1,X_2) = P(X_4|X_2) within the context of a Markov chain with finite possibilities. The proof utilizes the properties of conditional probabilities and transition matrices, specifically that P(X_3|X_1,X_2) = P(X_3|X_2) and P(X_4|X_1,X_2,X_3) = P(X_4|X_3). The conclusion is reached through a series of equations demonstrating that the fourth state is conditionally independent of the first state when the second state is known. The discussion also emphasizes that the properties of Markov chains are essential to understanding this relationship.

PREREQUISITES
  • Understanding of Markov chains and their properties
  • Knowledge of conditional probability and independence
  • Familiarity with transition matrices in finite state processes
  • Basic grasp of probability density functions for continuous states
NEXT STEPS
  • Study the derivation of conditional independence in Markov chains
  • Explore the construction and application of transition matrices in Markov processes
  • Learn about the implications of continuous state Markov chains
  • Investigate counter-examples in probability theory to reinforce understanding of independence
USEFUL FOR

Mathematicians, statisticians, data scientists, and anyone interested in the theoretical foundations of Markov chains and their applications in probability theory.

mathman
TL;DR
Requirement for Markov chain.
Given events ##X_i## and the following: ##P(X_3|X_1,X_2)=P(X_3|X_2)## and ##P(X_4|X_1,X_2,X_3)=P(X_4|X_3)##, prove that ##P(X_4|X_1,X_2)=P(X_4|X_2)##.

Special case proof: Finite number of possibilities for each ##X_i##. Transition matrix from ##X_2## to ##X_4## is product of transitions from ##X_2## to ##X_3## to ##X_4##. Generalize?
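The finite-state special case can be sketched numerically: the matrix identity behind "transition from ##X_2## to ##X_4## is the product of the transitions from ##X_2## to ##X_3## and ##X_3## to ##X_4##" is just the Chapman-Kolmogorov equation. A minimal numpy sketch (the state count and the random matrices are made-up illustrations, not from the thread):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_stochastic(n, m):
    """Random row-stochastic matrix: each row is a conditional distribution."""
    A = rng.random((n, m))
    return A / A.sum(axis=1, keepdims=True)

n = 3  # hypothetical number of states at each time
T23 = random_stochastic(n, n)  # T23[i, j] = P(X_3 = j | X_2 = i)
T34 = random_stochastic(n, n)  # T34[j, k] = P(X_4 = k | X_3 = j)

# Two-step transition: P(X_4 = k | X_2 = i) = sum_j T23[i, j] * T34[j, k],
# i.e. an ordinary matrix product.
T24 = T23 @ T34

# Each row of the product is again a probability distribution.
assert np.allclose(T24.sum(axis=1), 1.0)
```

The assertion checks that the product of row-stochastic matrices is row-stochastic, which is what makes ##T_{24}## a legitimate transition matrix.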
 
andrewkirk
Are you sure this is true? It doesn't feel true. We can partition ##X=X_1\cup X_2\cup X_3\cup X_4## into 15 disjoint subsets (##2^4 - 1##), and the only relationship between their probabilities is that they must be non-negative and sum to ##P(X)##. When we bring in the first two equations above, that still doesn't look like enough constraints on the 15 unknown probabilities to give us the third formula.

Given that, my approach would be to try to construct a counter-example by choosing values for the 15 disjoint components. If we find that the third formula holds no matter how we change the 15 probabilities, the numbers should give us a clue as to why the relationship holds.

Note this does not use the properties of Markov Chains at all. I can't see how they would relate to the problem as stated.
 
andrewkirk said:
Are you sure this is true? It doesn't feel true. We can partition ##X=X_1\cup X_2\cup X_3\cup X_4## into 15 disjoint subsets ( ##=2^4 - 1##), and the only relationship between their probabilities is that they must be non-negative and sum to ##P(X)##. When we bring in the first two equations above, that still doesn't look like enough constraints on the 15 unknown probabilities to give us the third formula.

Given that, my approach would be to try to construct a counter-example by choosing values for the 15 disjoint components. If we find that the third formula holds no matter how we change the 15 probabilities, the numbers should give us a clue as to why the relationship holds.

Note this does not use the properties of Markov Chains at all. I can't see how they would relate to the problem as stated.
The givens are the bare bones requirements for a sequence to be a Markov chain.
I couldn't follow you. Could you give the example?
 
mathman said:
The givens are the bare bones requirements for a sequence to be a Markov chain.
A Markov chain is defined as a process in which the probability of future events, as measured at time t, depends only on the state at time t. Markov chains can have continuous or discrete states and they can progress over continuous or discrete time. Finite state processes have finite transition matrices whereas continuous state processes have transition distributions.

How does a formula about four undefined events ##X_1,X_2,X_3,X_4##, with no context, have anything to do with that?
 
andrewkirk said:
A Markov chain is defined as a process in which the probability of future events, as measured at time t, depends only on the state at time t. Markov chains can have continuous or discrete states and they can progress over continuous or discrete time. Finite state processes have finite transition matrices whereas continuous state processes have transition distributions.

How does a formula about four undefined events ##X_1,X_2,X_3,X_4##, with no context, have anything to do with that?
Consider these to be the first four states in a Markov chain. The statements given are that the third state depends only on the second, independent of the first, while the fourth state depends only on the third, independent of the first two.

The question is: ignoring the third state, is the fourth state dependent only on the second, independent of the first?
 
Proof:

Let ##C_t## be the set of possible states at time ##t##.

\begin{align*}
P(X_4|X_1=x_1,X_2=x_2)
&=
\sum_{x_3\in C_3}
P(X_4|X_1=x_1,X_2=x_2, X_3=x_3)
P(X_3=x_3|X_1=x_1,X_2=x_2)
\\
&=
\sum_{x_3\in C_3}
P(X_4|X_3=x_3)
P(X_3=x_3|X_1=x_1,X_2=x_2)
\\ &\quad\quad\textrm{by the second given equation}
\\
&=
\sum_{x_3\in C_3}
P(X_4|X_3=x_3)
P(X_3=x_3|X_2=x_2)
\\ &\quad\quad\textrm{by the first given equation}
\\ &=
P(X_4|X_2=x_2)
\\ &\quad\quad\textrm{by total probability, since } P(X_4|X_2,X_3)=P(X_4|X_3)
\end{align*}

To cover the case of continuous states, replace the probabilities by probability density functions and the sums by integrals over the range ##C_3##.

To cover the case of continuous time, replace times 1, 2, 3, 4 by times ##0\le t_1\le t_2\le t_3\le t_4##.
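The discrete-state argument above can also be checked numerically: build an arbitrary finite Markov chain from an initial distribution and three transition matrices, form the joint distribution, and compare ##P(X_4|X_1,X_2)## with ##P(X_4|X_2)## directly. A sketch (all distributions here are random, made-up inputs; the point is only that the identity holds for any choice):

```python
import numpy as np

rng = np.random.default_rng(1)

def random_stochastic(n, m):
    """Random row-stochastic matrix: each row is a conditional distribution."""
    A = rng.random((n, m))
    return A / A.sum(axis=1, keepdims=True)

n = 3                                    # hypothetical state count
p1 = rng.random(n); p1 /= p1.sum()       # P(X_1)
T12 = random_stochastic(n, n)            # P(X_2 | X_1)
T23 = random_stochastic(n, n)            # P(X_3 | X_2)
T34 = random_stochastic(n, n)            # P(X_4 | X_3)

# Joint distribution of (X_1, X_2, X_3, X_4) implied by the Markov structure
joint = np.einsum('a,ab,bc,cd->abcd', p1, T12, T23, T34)

# P(X_4 | X_1, X_2): marginalize out X_3, then normalize over X_4
p124 = joint.sum(axis=2)
p4_given_12 = p124 / p124.sum(axis=2, keepdims=True)

# P(X_4 | X_2): marginalize out X_1 and X_3, then normalize over X_4
p24 = joint.sum(axis=(0, 2))
p4_given_2 = p24 / p24.sum(axis=1, keepdims=True)

# The claim: P(X_4 | X_1=a, X_2=b) equals P(X_4 | X_2=b) for every a, b
assert np.allclose(p4_given_12, p4_given_2[np.newaxis, :, :])
```

Because the inputs are arbitrary, the passing assertion illustrates (though of course does not prove) that the identity is forced by the Markov structure alone.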
 
This seems to be what I was looking for. For continuous states I think a proof is unnecessary, since the definition requires it.
 
