Hello, this is my first question here, so let's see how it goes. I need a proof of a simple problem. The result seems so obvious that I am finding it hard to prove. The question is as follows:

Let X = {X_n} (n = 0, 1, 2, ...) denote a Markov chain (MC) on a finite state space S with transition matrix P. Define the process Y = {Y_k} (k = 0, 1, 2, ...) by Y_k = X_{2k}. Prove or disprove that Y is a Markov chain.

On the face of it, Y must be a Markov chain: it is obtained by sampling X at every second step, so it should inherit the Markov property from X. But how can we prove this formally?
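One way to see what the formal proof should deliver is the Chapman-Kolmogorov identity: if Y is Markov, its one-step transition matrix should be P², since each step of Y corresponds to two steps of X. Below is a minimal numerical sketch of that claim, using a hypothetical 3-state matrix P chosen for illustration (it is not from the original question): it computes P² and compares it against transition frequencies estimated from a long simulated run of X sampled at even times.

```python
import numpy as np

# Hypothetical 3-state transition matrix P (any row-stochastic matrix works).
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])

# Claim to check: Y_k = X_{2k} is Markov with transition matrix P^2, since
# P(Y_{k+1}=j | Y_k=i) = sum_m P[i,m] * P[m,j] = (P @ P)[i,j]
# by the Chapman-Kolmogorov equation.
P2 = P @ P

# P^2 is again row-stochastic, i.e. a valid transition matrix.
assert np.allclose(P2.sum(axis=1), 1.0)

# Empirical check: simulate X and count Y's one-step transitions.
rng = np.random.default_rng(0)
n_states, n_steps = 3, 200_000
x = 0
prev_y = x
counts = np.zeros((n_states, n_states))
for t in range(1, n_steps):
    x = rng.choice(n_states, p=P[x])   # one step of X
    if t % 2 == 0:                     # Y_k = X_{2k}: look at every 2nd step
        counts[prev_y, x] += 1
        prev_y = x

est = counts / counts.sum(axis=1, keepdims=True)
print(np.round(est, 3))  # empirical Y-transition matrix, close to P2
```

Of course this is evidence, not a proof: the actual argument conditions on the full history (Y_0, ..., Y_k) = (X_0, X_2, ..., X_{2k}), sums out the intermediate state X_{2k+1}, and uses the Markov property of X twice to show the conditional law of Y_{k+1} depends only on Y_k.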

**Physics Forums - The Fusion of Science and Community**


# Proof for Markov Chains


