Hello, this is my first question here, so let's see how it goes. I need a proof for a simple problem. The proof seems so obvious that I am finding it hard to write down. The question is as follows:

Let X = {X_n} (n = 0, 1, 2, ...) denote a Markov chain (MC) on a finite state space S with transition matrix P. Consider the chain Y = {Y_k} (k = 0, 1, 2, ...) defined by Y_k = X_{2k}. Prove or disprove that Y is a Markov chain.

On the face of it, Y must be a Markov chain: it is obtained by sampling X at even times, so it should inherit the Markov property from X. But how can we prove this formally?
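One way to build intuition (not a proof) is numerical: if Y is indeed a Markov chain, its one-step transition matrix should be the two-step matrix P² of X, by the Chapman-Kolmogorov equations. Here is a minimal sketch, assuming NumPy and a hypothetical 2-state chain, that compares P² against transition frequencies estimated from a simulated path:

```python
import numpy as np

# Hypothetical 2-state transition matrix P (each row sums to 1).
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

# If Y_k = X_{2k} is Markov, its one-step transition matrix is the
# two-step matrix of X, i.e. P @ P (Chapman-Kolmogorov).
P2 = P @ P

# Simulate a long path of X and estimate P(Y_{k+1}=j | Y_k=i) empirically.
rng = np.random.default_rng(0)
n_steps = 100_000
x = np.empty(n_steps, dtype=int)
x[0] = 0
for t in range(1, n_steps):
    x[t] = rng.choice(2, p=P[x[t - 1]])

y = x[::2]  # the subsampled chain Y_k = X_{2k}

# Count observed Y-transitions and normalize each row.
counts = np.zeros((2, 2))
for a, b in zip(y[:-1], y[1:]):
    counts[a, b] += 1
est = counts / counts.sum(axis=1, keepdims=True)

print(np.round(P2, 3))   # exact two-step matrix
print(np.round(est, 3))  # empirical estimate, should be close to P2
```

This only checks the conjectured transition matrix; the actual proof conditions on the full history of Y and uses the Markov property of X at the even times.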

**Physics Forums | Science Articles, Homework Help, Discussion**


# Proof for Markov Chains
