
Proof for Markov Chains

  1. Oct 27, 2008 #1
    Hello, this is my first question here, so let's see how it goes. I need a proof for a simple problem. The claim seems so obvious that I am finding it hard to prove. The question is as follows:

    Let X = {X_n, n = 0, 1, 2, ...} denote a Markov chain (MC) on a finite state space S with transition matrix P. Define the process Y = {Y_n, n = 0, 1, 2, ...} by Y_k = X_{2k}. Prove or disprove that Y is a MC.

    On the face of it, Y must be a Markov chain: it is obtained from X by sampling every other step, and X satisfies the Markov property. But how can we prove this formally?
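    Before writing the proof, one way to convince yourself is a quick simulation. The sketch below (with an arbitrarily chosen 3-state transition matrix P, purely for illustration) simulates X, subsamples Y_k = X_{2k}, estimates Y's one-step transition matrix empirically, and compares it to P², which is what the formal argument predicts:

```python
import numpy as np

# Hypothetical 3-state transition matrix (any stochastic matrix works).
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])

rng = np.random.default_rng(0)
n = 200_000

# Simulate the chain X_0, X_1, ..., X_{n-1}.
x = np.empty(n, dtype=int)
x[0] = 0
for t in range(1, n):
    x[t] = rng.choice(3, p=P[x[t - 1]])

# Subsample every other state: Y_k = X_{2k}.
y = x[::2]

# Estimate Y's one-step transition matrix from observed pairs.
counts = np.zeros((3, 3))
for a, b in zip(y[:-1], y[1:]):
    counts[a, b] += 1
Q_hat = counts / counts.sum(axis=1, keepdims=True)

print(np.round(Q_hat, 3))
print(np.round(P @ P, 3))  # the two matrices should agree closely
```

    Of course, agreement of the one-step frequencies with P² does not by itself prove the Markov property of Y; it only makes the claim plausible before the formal check.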
  3. Oct 28, 2008 #2


    Science Advisor, Homework Helper

    Welcome to PF stochfreak.

    You can just check that the definition of the Markov property (what is it?) holds for Y, using the fact that X satisfies it.
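    For concreteness, the check suggested above might be sketched as follows (conditioning on the intermediate state X_{2k+1} and using the Markov property of X; a fully rigorous version conditions on the entire history of X):

```latex
\begin{align*}
\Pr(Y_{k+1}=j \mid Y_k=i,\, Y_{k-1}=i_{k-1},\dots,Y_0=i_0)
  &= \Pr(X_{2k+2}=j \mid X_{2k}=i,\, X_{2k-2}=i_{k-1},\dots,X_0=i_0) \\
  &= \sum_{m\in S} \Pr(X_{2k+2}=j \mid X_{2k+1}=m,\, X_{2k}=i,\dots)\,
     \Pr(X_{2k+1}=m \mid X_{2k}=i,\dots) \\
  &= \sum_{m\in S} P_{im}\,P_{mj} \;=\; (P^2)_{ij}.
\end{align*}
```

    The result depends only on i and j, not on the earlier states, so Y is a Markov chain with transition matrix P².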