Markov chains

  1. Nov 18, 2008 #1
    Can someone prove that an irreducible Markov chain on a finite state space {0,1,...,m} is not a martingale?
     
  3. Nov 18, 2008 #2
    Suppose [itex]S_n[/itex] is an irreducible Markov chain with finite state space [itex]\{0, 1, ..., m\}[/itex]. For it to also be a martingale would require [itex]E(S_{n+1}|S_n) = S_n[/itex]. Consider the case where [itex]S_n = 0[/itex]. Then the martingale condition would be [itex]E(S_{n+1}|S_n=0) = 0[/itex]. But every state is nonnegative, so this conditional expectation can be zero only if all of the probability mass sits on state 0, i.e. [itex]P(S_{n+1}=0|S_n=0)=1[/itex]. That makes state 0 absorbing, which violates the assumption of irreducibility. So an irreducible Markov chain with finite state space cannot be a martingale.
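    The boundary argument can be sketched numerically (my own illustration, not part of the original post; the example rows are hypothetical transition distributions):

```python
# At state 0, with state space {0, ..., m}, every next state is >= 0,
# so the conditional mean sum(k * p[k]) is zero only if all the
# probability mass sits on state 0 itself.

def conditional_mean(row):
    """Expected next state for a transition row over {0, ..., m}."""
    return sum(k * p for k, p in enumerate(row))

m = 4
# Any row that puts mass on a state k >= 1 has a strictly positive mean,
# so the martingale condition E(S_{n+1}|S_n=0) = 0 fails:
leaky = [0.9, 0.1, 0.0, 0.0, 0.0]
assert conditional_mean(leaky) > 0

# The only zero-mean row is the absorbing one, which makes the chain
# reducible:
absorbing = [1.0, 0.0, 0.0, 0.0, 0.0]
assert conditional_mean(absorbing) == 0
```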

    Notice that this is not the case for Markov Chains with infinite state spaces. Since there is no "edge" to the state space, it's easy to construct non-trivial conditional distributions with the required expected values, which then gives an irreducible chain. Can you think of an example?
     
    Last edited: Nov 18, 2008
  4. Nov 19, 2008 #3
    Thank you very much for your reply.
    Why do you assume that [itex]S_n = 0[/itex]? Moreover, why is [itex]P(S_{n+1}=0|S_n=0)=1[/itex]?

    I also don't understand what you mean by 'no "edge" to the state space'.
     
  5. Nov 19, 2008 #4
    I'm confused too. I understand the Martingale condition, but why must that imply [itex]P(S_{n+1} = 0| S_{n} = 0) = 1[/itex]?

    If [itex]P(S_{n+1} = -1 | S_{n} = 0) = 0.5[/itex] and [itex]P(S_{n+1} = 1 | S_{n} = 0) = 0.5[/itex] then the Martingale condition still holds because the expected value is still 0. Is this right or am I missing something?
     
  6. Nov 19, 2008 #5
    The issue is that [itex]S_{n+1}=-1[/itex] is not in the state space, which, remember, consists of [itex]\{0, ... , m\}[/itex]. If the state-space is infinite, then your approach would always work, as there would always be valid parts of the state space both above and below the current state. But for a finite state space, it's impossible to construct non-trivial conditional distributions for [itex]S_n=0[/itex] and [itex]S_n=m[/itex] that satisfy the Martingale condition.
     
  7. Nov 19, 2008 #6
    BTW, the Markov Chain with countable state space and transition probability [itex]P(S_{n+1}=s+1|S_n = s) = P(S_{n+1}=s-1|S_n = s)=1/2[/itex] is the (discrete, symmetric) Random Walk, which is a classic example of a martingale.
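    A quick empirical sanity check of the martingale property for this random walk (my own sketch, not from the thread; the state value and sample count are arbitrary):

```python
import random

random.seed(0)

def walk_step(s):
    """One step of the symmetric random walk: s -> s +/- 1 with prob 1/2."""
    return s + random.choice((-1, 1))

# The martingale condition E[S_{n+1} | S_n = s] = s, checked by
# Monte Carlo at a fixed state s:
s = 3
samples = 100_000
mean_next = sum(walk_step(s) for _ in range(samples)) / samples
assert abs(mean_next - s) < 0.05
```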
     
  8. Nov 19, 2008 #7
    Mr.quadraphonics, I have just started learning Martingales in the classical way(i.e. measure theoretic).
    The definition: a sequence of integrable random variables [itex]S_n[/itex] is a martingale with respect to a filtration [itex]\mathcal{F}_n[/itex] if (1) [itex]S_n[/itex] is [itex]\mathcal{F}_n[/itex]-measurable and (2) [itex]E[S_{n+1}|\mathcal{F}_n]=S_n[/itex].

    My questions to you are the following:
    1) How can you assume that [itex]S_n=0[/itex]?
    2) How can you condition on [itex]S_n[/itex] instead of [itex]\mathcal{F}_n[/itex]?
    3) Moreover, can you define some stopping time [itex]\tau[/itex] so that the stopped process is a Martingale?

    Thank you very much for all your replies.
     
  9. Nov 20, 2008 #8
    The irreducibility condition on a Markov chain is that, starting from any state, it is possible to reach any other state in some finite number of steps. So, to prove that a Markov chain is NOT irreducible (which is what we're doing here), you only have to exhibit a single state from which it is not possible to reach some other state. I chose [itex]S_n = 0[/itex] since I happen to know that, under the martingale condition, it is such a state ([itex]S_n=m[/itex] will also work, for the same reason).
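    That reachability definition can be checked mechanically; here is a sketch of an irreducibility test as a breadth-first search over positive-probability transitions (my own illustration; the example matrices are hypothetical):

```python
from collections import deque

def is_irreducible(P):
    """True iff every state can reach every other state via a path of
    positive-probability transitions in the matrix P."""
    n = len(P)
    for start in range(n):
        seen = {start}
        queue = deque([start])
        while queue:
            i = queue.popleft()
            for j in range(n):
                if P[i][j] > 0 and j not in seen:
                    seen.add(j)
                    queue.append(j)
        if seen != set(range(n)):   # some state is unreachable from start
            return False
    return True

# A chain absorbed at state 0 is not irreducible:
absorbing = [[1.0, 0.0], [0.5, 0.5]]
assert not is_irreducible(absorbing)

# A two-state chain that flips back and forth is irreducible:
flip = [[0.0, 1.0], [1.0, 0.0]]
assert is_irreducible(flip)
```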

    That's basically a shorthand. The underlying, general definition of the martingale works in terms of filtrations, but for a Markov chain we can abbreviate this by conditioning on the current value: [itex]E(S_{n+1}|\mathcal{F}_n) = E(S_{n+1}|S_n)[/itex], since the Markov property says the past beyond [itex]S_n[/itex] carries no extra information. If you're taking a measure-theoretic probability class, they'll probably cover this point explicitly.

    A stopping time with respect to what stochastic process? A finite-state Markov Chain? Or a martingale?
     
  10. Nov 20, 2008 #9
    With respect to the finite-state irreducible Markov chain.

    I don't understand why the argument fails for an irreducible Markov chain with an infinite state space. Can you please explain it to me?
    Thanks.
     
  11. Nov 21, 2008 #10
    Maybe [itex]\sum_{i=0}^{n}S_i/n[/itex] would work?

    Well, if the state space is (doubly) infinite, [itex]S_n \in \mathbb{Z}[/itex], then the random walk construction mentioned in the previous posts is both an irreducible Markov chain and a martingale. The random walk, recall, is the Markov chain whose transition probabilities are [itex]P(S_{n+1}=s+1|S_n=s)=P(S_{n+1}=s-1|S_n=s)=0.5[/itex].
     
  12. Nov 21, 2008 #11
    No! I want a stopping time (an integer-valued random variable) [itex]\tau[/itex] for my finite-state irreducible Markov chain [itex]S_n[/itex] such that the stopped process [itex]S_{\tau \wedge n}[/itex] is a martingale.
     
  13. Nov 21, 2008 #12
    What about a random walk that then stops when it hits either 0 or m?
     
  14. Nov 22, 2008 #13
    Do you mean either [itex]\tau=0[/itex] or [itex]\tau=m[/itex]?
    Moreover, can you give an example of a Martingale which is not a Markov chain?
     
  15. Nov 24, 2008 #14
    No, [itex]\tau[/itex] will be whatever time step [itex]S_n[/itex] first equals either 0 or m.

    Do you mean specifically a discrete-time, finite-state martingale that is not a first-order Markov chain?
     
  16. Nov 24, 2008 #15
    Do you mean [itex]\tau(\omega)=\min\{n : S_n(\omega)=0 \text{ or } S_n(\omega)=m\}[/itex]?
     
  17. Nov 25, 2008 #16
    Indeed.
     
  18. Nov 25, 2008 #17
    Thank you very much quadraphonics.
    One final question.
    Can you prove that the stopped process [itex]Y_n=S_{\tau \wedge n}[/itex], where [itex]\tau(\omega)=\min\{n : S_n(\omega)=0 \text{ or } S_n(\omega)=m\}[/itex], is a martingale w.r.t. the natural filtration [itex]\mathcal{F}_n=\sigma(S_0, S_1,..., S_n)[/itex]?
     
  19. Nov 25, 2008 #18
    Yes, it's a straightforward application of the material I've already presented in this thread. Just show that the proposed stopped Markov Chain satisfies the martingale properties.
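    One consequence of the martingale property that is easy to check by simulation is that the stopped process keeps a constant mean, [itex]E[S_{\tau \wedge n}] = S_0[/itex]. A Monte Carlo sketch (my own illustration; the start state, boundary m, and horizon are arbitrary):

```python
import random

random.seed(1)

def stopped_walk(start, m, n_steps):
    """Symmetric random walk on {0, ..., m}, frozen once it hits 0 or m."""
    s = start
    for _ in range(n_steps):
        if s in (0, m):
            break                 # tau has occurred; S_{tau ^ n} stays put
        s += random.choice((-1, 1))
    return s

# A martingale has constant expectation, so E[S_{tau ^ n}] should stay
# near the starting state:
start, m, n = 2, 5, 50
trials = 100_000
mean = sum(stopped_walk(start, m, n) for _ in range(trials)) / trials
assert abs(mean - start) < 0.05
```

This also recovers the gambler's-ruin flavor of the problem: the walk ends up absorbed at 0 or m with probabilities that make the average come out to the starting state.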
     