Proof for Markov Chains

  1. Oct 27, 2008 #1
    Hello, this is my first question here, so let's see how it goes. I need a proof of a simple result. The claim seems so obvious that I am finding it hard to prove formally. The question is as follows:

    Let X = {X_n} (n = 0 to infinity) denote a Markov chain (MC) on a finite state space S with transition matrix P. Define the process Y = {Y_n} (n = 0 to infinity) by Y_k = X_{2k}. Prove or disprove that Y is a MC.

    On the face of it, Y must be a Markov chain, since X satisfies the Markov property and Y is derived from X by sampling every other step. But how can we prove this formally?
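    To be concrete (assuming the standard definition of the Markov property), what has to be shown is that for every n and all states i_0, ..., i_{n+1} in S,

        P(Y_{n+1} = i_{n+1} | Y_n = i_n, Y_{n-1} = i_{n-1}, ..., Y_0 = i_0) = P(Y_{n+1} = i_{n+1} | Y_n = i_n),

    whenever the conditioning event has positive probability.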
     
  2. Oct 28, 2008 #2

    CompuChip

    Science Advisor
    Homework Helper

    Welcome to PF, stochfreak.

    You can just check that Y satisfies the definition of the Markov property (what is it?), using the fact that X satisfies it.
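    For what it's worth, here is a sketch of how that check can go (assuming X is time-homogeneous, so that P(X_{m+1} = j | X_m = i) = P_{ij} for every m): condition on the intermediate state X_{2n+1} and sum over it.

        P(Y_{n+1} = j | Y_n = i, Y_{n-1} = i_{n-1}, ..., Y_0 = i_0)
          = P(X_{2n+2} = j | X_{2n} = i, X_{2n-2} = i_{n-1}, ..., X_0 = i_0)
          = sum_{k in S} P(X_{2n+2} = j, X_{2n+1} = k | X_{2n} = i, ..., X_0 = i_0)
          = sum_{k in S} P(X_{2n+1} = k | X_{2n} = i) P(X_{2n+2} = j | X_{2n+1} = k)    [Markov property of X, applied twice]
          = sum_{k in S} P_{ik} P_{kj}
          = (P^2)_{ij}.

    The result does not depend on i_{n-1}, ..., i_0, so Y is indeed a Markov chain, with transition matrix P^2 (this is just the Chapman-Kolmogorov relation).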
     