Proving Irreducible Markov Chains not Martingales

  • Thread starter Edwinkumar
In summary, an irreducible Markov chain on a finite state space {0, 1, ..., m} cannot be a martingale. For it to be a martingale, the expected value of the next state given the current state must equal the current state. If the current state is 0, this would require P(S_{n+1}=0|S_n=0)=1, which violates the assumption of irreducibility. This is not the case for Markov chains with infinite state spaces, where it is possible to construct non-trivial conditional distributions that satisfy the martingale condition. An example is the symmetric random walk, whose transition probabilities to each of the two neighboring states are 1/2.
  • #1
Edwinkumar
Can someone prove that an irreducible markov chain on a finite state space {0,1,...,m} is not a Martingale?
 
  • #2
Well, suppose [itex]S_n[/itex] is an irreducible Markov chain with finite state space. For it to also be a Martingale would require [itex]E(S_{n+1}|S_n) = S_n[/itex]. Consider the case where [itex]S_n = 0[/itex]. Then the Martingale condition would be [itex]E(S_{n+1}|S_n=0) = 0[/itex]. Since every state in [itex]\{0, 1, \ldots, m\}[/itex] is nonnegative, a conditional expectation of zero forces [itex]P(S_{n+1}=0|S_n=0)=1[/itex], which violates the assumption of irreducibility (state 0 could never reach any other state). So, an irreducible Markov Chain with finite state space cannot be a Martingale.

Notice that this is not the case for Markov Chains with infinite state spaces. Since there is no "edge" to the state space, it's easy to construct non-trivial conditional distributions with the required expected values, which then gives an irreducible chain. Can you think of an example?
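The boundary argument above can be checked numerically. Here is an illustrative Python sketch (with m = 3 chosen just for the example): searching a coarse grid of probability distributions over the nonnegative states {0, 1, 2, 3}, the only distributions with conditional mean 0 put all their mass on state 0.

```python
from itertools import product

# Illustrative sketch: over nonnegative states {0, ..., m}, any transition
# distribution out of state 0 with conditional mean 0 must be degenerate at 0.
m = 3
grid = [k / 4 for k in range(5)]  # candidate probabilities in steps of 0.25

mean_zero_dists = [
    p for p in product(grid, repeat=m + 1)
    if abs(sum(p) - 1.0) < 1e-12                              # valid distribution
    and abs(sum(i * pi for i, pi in enumerate(p))) < 1e-12    # conditional mean 0
]
print(mean_zero_dists)  # [(1.0, 0.0, 0.0, 0.0)] -- only the degenerate one
```

Since every term i * p[i] is nonnegative, a mean of zero forces every term to vanish, which is exactly the degeneracy the argument uses.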
 
  • #3
Thank you very much for your reply.
quadraphonics said:
Consider the case where [itex]S_n = 0[/itex]. Then the Martingale condition would be [itex]E(S_{n+1}|S_n=0) = 0[/itex], which would require that [itex]P(S_{n+1}=0|S_n=0)=1[/itex], which violates the assumption of irreducibility.

Why do you assume that [itex]S_n = 0[/itex]? Moreover, why is [itex]P(S_{n+1}=0|S_n=0)=1[/itex]?

quadraphonics said:
Since there is no "edge" to the state space, it's easy to construct non-trivial conditional distributions with the required expected values, which then gives an irreducible chain. Can you think of an example?

I don't understand what you mean by 'no "edge" to the state space'.
 
  • #4
I'm confused too. I understand the Martingale condition, but why must that imply [itex]P(S_{n+1} = 0| S_{n} = 0) = 1[/itex]?

If [itex]P(S_{n+1} = -1 | S_{n} = 0) = 0.5[/itex] and [itex]P(S_{n+1} = 1 | S_{n} = 0) = 0.5[/itex] then the Martingale condition still holds because the expected value is still 0. Is this right or am I missing something?
 
  • #5
Boxcar Billy said:
If [itex]P(S_{n+1} = -1 | S_{n} = 0) = 0.5[/itex] and [itex]P(S_{n+1} = 1 | S_{n} = 0) = 0.5[/itex] then the Martingale condition still holds because the expected value is still 0. Is this right or am I missing something?

The issue is that [itex]S_{n+1}=-1[/itex] is not in the state space, which, remember, consists of [itex]\{0, ... , m\}[/itex]. If the state-space is infinite, then your approach would always work, as there would always be valid parts of the state space both above and below the current state. But for a finite state space, it's impossible to construct non-trivial conditional distributions for [itex]S_n=0[/itex] and [itex]S_n=m[/itex] that satisfy the Martingale condition.
 
  • #6
BTW, the Markov Chain with countable state space and transition probability [itex]P(S_{n+1}=s+1|S_n = s) = P(S_{n+1}=s-1|S_n = s)=1/2[/itex] is the (discrete, symmetric) Random Walk, which is a classic example of a martingale.
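As a sanity check on that example (a simulation sketch, not a proof), one can simulate the symmetric random walk and observe that the empirical mean of many sample paths stays near the starting point, as the martingale property [itex]E(S_{n+1}|S_n)=S_n[/itex] predicts:

```python
import random

# Simulation sketch: the symmetric random walk steps +1 or -1 with probability
# 1/2 each, so its conditional mean never drifts away from the current state.
def random_walk(n_steps, rng):
    s = 0
    path = [s]
    for _ in range(n_steps):
        s += rng.choice((-1, 1))
        path.append(s)
    return path

rng = random.Random(0)
paths = [random_walk(50, rng) for _ in range(20000)]
mean_final = sum(p[-1] for p in paths) / len(paths)
print(mean_final)  # empirical mean of S_50 over 20000 paths; stays near S_0 = 0
```

With 20000 paths the standard error of the mean is about sqrt(50)/sqrt(20000) ≈ 0.05, so the printed value should be well within a few tenths of zero.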
 
  • #7
Mr. quadraphonics, I have just started learning Martingales in the classical way (i.e., measure-theoretic).
The definition: a sequence of integrable random variables [itex]S_n[/itex] is a Martingale with respect to a filtration [itex]\mathcal{F}_n[/itex] if (1) [itex]S_n[/itex] is [itex]\mathcal{F}_n[/itex]-measurable and (2) [itex]E[S_{n+1}|\mathcal{F}_n]=S_n[/itex].

My questions to you are the following:
1) How can you assume that [itex]S_n=0[/itex]?
2) How can you condition on [itex]S_n[/itex] instead of [itex]\mathcal{F}_n[/itex]?
3) Moreover, can you define some stopping time [itex]\tau[/itex] so that the stopped process is a Martingale?

Thank you very much for all your replies.
 
  • #8
Edwinkumar said:
1) How can you assume that [itex]S_n=0[/itex]?

The irreducibility condition on a Markov Chain is that, from any starting state, it is possible to reach any other state in some finite number of steps. So, to prove that a Markov Chain is NOT irreducible (which is what we're doing here, under the assumption that it is a Martingale), you only have to exhibit a single state from which it is not possible to get to some other state. I chose [itex]S_n = 0[/itex], since I happen to know that this is such a state ([itex]S_n=m[/itex] will also work, for the same reasons).

Edwinkumar said:
2) How can you condition [itex]S_n[/itex] instead of [itex]\mathcal{F}_n[/itex]?

That's basically a shorthand. The underlying, general definition of the martingale works in terms of filtrations, but for a Markov chain with its natural filtration, the Markov property gives [itex]E[S_{n+1}|\mathcal{F}_n]=E[S_{n+1}|S_n][/itex], so conditioning on [itex]S_n[/itex] alone suffices. If you're taking a measure-theoretic probability class, they'll probably cover this issue explicitly.

Edwinkumar said:
3) Moreover, can you define some stopping time [itex]\tau[/itex] so that the stopped process is a Martingale?

A stopping time with respect to what stochastic process? A finite-state Markov Chain? Or a martingale?
 
  • #9
quadraphonics said:
A stopping time with respect to what stochastic process? A finite-state Markov Chain? Or a martingale?
With respect to the finite state irreducible markov chain.

quadraphonics said:
Notice that this is not the case for Markov Chains with infinite state spaces. Since there is no "edge" to the state space, it's easy to construct non-trivial conditional distributions with the required expected values, which then gives an irreducible chain. Can you think of an example?

I don't understand why the argument fails in the case of an irreducible Markov chain with infinite state space. Can you please explain it to me?
Thanks.
 
  • #10
Edwinkumar said:
With respect to the finite state irreducible markov chain.

Maybe [itex]\sum_{i=0}^{n}S_i/n[/itex] would work?

Edwinkumar said:
I don't understand why is it not working in case of a an irreducible Markov chain with infinite state space. Can you please explain to me?

Well, if the state space is (doubly) infinite: [itex]S_n \in \mathbb{Z}[/itex], then the Random Walk construction mentioned in the previous posts is both an irreducible Markov Chain and a martingale. The Random Walk, recall, is when the transition matrix for the Markov Chain is given by [itex]P(S_{n+1}=s+1|S_n=s)=P(S_{n+1}=s-1|S_n=s)=0.5[/itex].
 
  • #11
quadraphonics said:
Maybe [itex]\sum_{i=0}^{n}S_i/n[/itex] would work?
No! I want a stopping time (an integer-valued random variable) [itex]\tau[/itex] for my finite-state irreducible Markov chain [itex]S_n[/itex] such that the stopped process [itex]S_{\tau \wedge n}[/itex] is a Martingale.
 
  • #12
What about a random walk that then stops when it hits either 0 or m?
 
  • #13
Do you mean either [itex]\tau=0[/itex] or [itex]\tau=m[/itex]?
Moreover, can you give an example of a Martingale which is not a Markov chain?
 
  • #14
Edwinkumar said:
Do you mean either [itex]\tau=0[/itex] or [itex]\tau=m[/itex]?

No, [itex]\tau[/itex] will be the first time step at which [itex]S_n[/itex] equals either 0 or m.

Edwinkumar said:
Moreover, can you give an example of a Martingale which is not a Markov chain?

Do you mean specifically a discrete-time, finite-state martingale that is not a first-order Markov chain?
 
  • #15
quadraphonics said:
No, [itex]\tau[/itex] will be whatever time step [itex]S_n[/itex] first equals either 0 or m.

Do you mean [itex]\tau(\omega)=\min\{n: S_n(\omega)=0 \text{ or } S_n(\omega)=m\}[/itex]?
 
  • #16
Edwinkumar said:
Do you mean [itex]\tau(\omega)=\min\{n: S_n(\omega)=0 \text{ or } S_n(\omega)=m\}[/itex]?

Indeed.
 
  • #17
Thank you very much quadraphonics.
One final question.
Can you prove that the stopped process [itex]Y_n=S_{\tau \wedge n}[/itex], where [itex]\tau(\omega)=\min\{n: S_n(\omega)=0 \text{ or } S_n(\omega)=m\}[/itex], is a martingale w.r.t. the natural filtration [itex]\mathcal{F}_n=\sigma(S_0, S_1,\ldots, S_n)[/itex]?
 
  • #18
Edwinkumar said:
Thank you very much quadraphonics.
One final question.
Can you prove that the stopped process [itex]Y_n=S_{\tau \wedge n}[/itex], where [itex]\tau(\omega)=\min\{n: S_n(\omega)=0 \text{ or } S_n(\omega)=m\}[/itex], is a martingale w.r.t. the natural filtration [itex]\mathcal{F}_n=\sigma(S_0, S_1,\ldots, S_n)[/itex]?

Yes, it's a straightforward application of the material I've already presented in this thread. Just show that the proposed stopped Markov Chain satisfies the martingale properties.
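As a concrete sanity check on that claim (a sketch, with m = 5 chosen just for illustration): the stopped walk's transition rows are absorbing at 0 and m and symmetric elsewhere, and each row's conditional mean equals the current state, which is exactly the martingale condition.

```python
# Exact check for the stopped symmetric walk on {0, ..., m} (m = 5 is an
# arbitrary example): absorbing at the boundaries, +/-1 with prob. 1/2 inside.
m = 5
states = range(m + 1)

def step_distribution(s):
    """Transition row of the stopped walk: absorbing at 0 and m."""
    if s == 0 or s == m:
        return {s: 1.0}
    return {s - 1: 0.5, s + 1: 0.5}

# Martingale condition: E[Y_{n+1} | Y_n = s] = s for every state s.
martingale = all(
    abs(sum(t * p for t, p in step_distribution(s).items()) - s) < 1e-12
    for s in states
)
print(martingale)  # True: each row's conditional mean equals the current state
```

At an interior state the mean is 0.5(s-1) + 0.5(s+1) = s, and at an absorbing boundary it is trivially s, so the check passes everywhere.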
 

1. What is an Irreducible Markov Chain?

An irreducible Markov chain is one in which every state can be reached from every other state in a finite number of steps with positive probability. This is distinct from the Markov property itself, which says that the future state depends only on the current state and not on any earlier states, and from time-homogeneity, which says that the transition probabilities between states remain constant over time.

2. How is a Markov Chain proved to be Irreducible?

To prove that a Markov Chain is irreducible, we need to show that there is a non-zero probability of transitioning from any state to any other state in the chain. This can be done by constructing a transition matrix and showing that it is possible to reach any state from any other state by following a sequence of transitions.
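As an illustration of one possible method (a Python sketch with made-up example chains): irreducibility of a finite chain with n-by-n transition matrix P can be tested by checking that I + P + P² + ... + Pⁿ has all positive entries, i.e., every state reaches every other within n steps.

```python
# Sketch: test irreducibility of a finite Markov chain by summing matrix powers.
def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def is_irreducible(P):
    n = len(P)
    identity = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    power = [row[:] for row in identity]
    total = [row[:] for row in identity]   # accumulates I + P + ... + P^n
    for _ in range(n):
        power = matmul(power, P)
        total = [[total[i][j] + power[i][j] for j in range(n)] for i in range(n)]
    return all(x > 0 for row in total for x in row)

# Example chains on {0, 1, 2}: a reflecting walk is irreducible,
# a walk with absorbing boundaries is not.
reflecting = [[0, 1, 0], [0.5, 0, 0.5], [0, 1, 0]]
absorbing  = [[1, 0, 0], [0.5, 0, 0.5], [0, 0, 1]]
print(is_irreducible(reflecting), is_irreducible(absorbing))  # True False
```

The absorbing chain fails because from state 0 the only reachable state is 0 itself, so the corresponding row of the accumulated sum has zero entries.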

3. What is a Martingale in the context of Markov Chains?

A Martingale is a stochastic process in which the conditional expectation of the next value, given all past values, equals the current value. This does not mean the future is independent of the past; it means the process has no drift: on average, it stays where it is.

4. How do you prove that an Irreducible Markov Chain is not a Martingale?

To prove that an irreducible Markov chain on {0, 1, ..., m} is not a Martingale, suppose it were one and examine the boundary state 0. The Martingale property would require the expected next state, given the current state 0, to be 0; since all states are nonnegative, this forces the chain to stay at 0 with probability 1, contradicting irreducibility. (The same argument works at the state m.)

5. What are the applications of proving Irreducible Markov Chains not Martingales?

The applications of proving Irreducible Markov Chains not Martingales include understanding the behavior of complex systems, predicting future states in stochastic processes, and developing efficient algorithms for various problems in mathematics, physics, and computer science.
