"Two step" Markov chain is also a Markov chain.

In summary: Given a Markov chain $P:X\to \mathscr P(X)$ on a compact metric space $X$, the thread asks whether the two-step kernel $P^2$ is again Borel measurable, i.e. again a Markov chain.
  • #1
caffeinemachine
Let $X$ be a compact metric space and $\mathcal X$ be its Borel $\sigma$-algebra. Let $\mathscr P(X)$ be the set of all Borel probability measures on $X$. A **Markov chain** on $X$ is a measurable map $P:X\to \mathscr P(X)$. We write the image of $x$ under $P$ as $P_x$. (Here $\mathscr P(X)$ is equipped with the Borel $\sigma$-algebra coming from the weak* topology.)

Intuitively, for $E\in \mathcal X$, we think of $P_x(E)$ as the probability of landing inside $E$ in the next step given that we are sitting at $x$ at the current instant.

Now let $\mu$ be a probability measure on $X$. We define a new measure $\nu$ on $X$ as follows:

$$\nu(E) = \int_X P_x(E)\ d\mu(x)$$

Intuitively, suppose in the first step we land in $X$ according to $\mu$. Heuristically, the probability of landing on $x\in X$ according to the measure $\mu$ is $d\mu(x)$.
Now let the Markov chain $P$ drive us. Then the probability of landing in $E$ in the next step given that we are at $x$ is $P_x(E)$.
So the probability of landing in $E$ in two steps is $\int_X P_x(E)\ d\mu(x)$.
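As a concrete sanity check (my own illustration, not part of the original post): when $X=\{1,\dots,n\}$ is finite, $P$ is just a row-stochastic matrix whose $x$-th row is the distribution $P_x$, and the integral defining $\nu$ reduces to the matrix product $\nu = \mu P$. A minimal sketch in Python, with a hypothetical 3-state chain:

```python
import numpy as np

# Hypothetical 3-state chain: row x of P is the distribution P_x.
P = np.array([[0.5, 0.5, 0.0],
              [0.1, 0.6, 0.3],
              [0.0, 0.2, 0.8]])

mu = np.array([0.2, 0.3, 0.5])  # initial distribution on X = {0, 1, 2}

# nu(E) = sum_x P_x(E) * mu(x): the integral becomes a matrix product.
nu = mu @ P

print(nu)        # distribution after the next step of the chain
print(nu.sum())  # 1.0: nu is again a probability measure
```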

So we have a map $\mathscr P(X)\to \mathscr P(X)$ which takes a probability measure $\mu$ and produces a new measure $\nu$ as defined above.

Composing $P:X\to \mathscr P(X)$ with the map $\mathscr P(X)\to \mathscr P(X)$ above, we again get a map $X\to \mathscr P(X)$, which I will denote by $P^2$.
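Spelling out the composition, which follows directly from the definitions above: $P^2$ sends $x$ first to $P_x$ and then pushes $P_x$ through the map $\mathscr P(X)\to \mathscr P(X)$, so for $E\in \mathcal X$,

$$P^2_x(E) = \int_X P_y(E)\ dP_x(y)$$

which is the Chapman–Kolmogorov relation for the two-step transition probabilities.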

Question. Is $P^2$ also a Markov chain, that is, is $P^2$ also Borel measurable?

My guess is that the map $\mathscr P(X)\to \mathscr P(X)$ is actually continuous, which would answer the question in the affirmative. But I am not sure.
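Again in the finite sanity-check setting (my illustration, not a proof in the compact metric case): $P^2$ is just the matrix square of $P$, and a product of row-stochastic matrices is row-stochastic, so in the finite case $P^2$ is indeed again a Markov chain.

```python
import numpy as np

# Same hypothetical 3-state chain as above.
P = np.array([[0.5, 0.5, 0.0],
              [0.1, 0.6, 0.3],
              [0.0, 0.2, 0.8]])

# Two-step kernel: row x of P @ P is the distribution P^2_x.
P2 = P @ P

# Each row of P2 is nonnegative and sums to 1, so P2 is again
# a (row-stochastic) transition kernel.
print(P2)
print(P2.sum(axis=1))  # [1. 1. 1.]
```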
 
  • #2
Walagaster
Congratulations. You have managed to color your post unreadable.
 
  • #3
caffeinemachine
Walagaster said:
Congratulations. You have managed to color your post unreadable.
I am not sure what is wrong. It's appearing on my computer just fine.

Perhaps you are viewing it on another device?
 
  • #4
caffeinemachine said:
I am not sure what is wrong. It's appearing on my computer just fine.

When I quote-reply your original post, I find that each paragraph is inside a set of color-tags with color code #cbd2ac. When I remove those tags, the text becomes readable.
 
  • #5
caffeinemachine said:
I am not sure what is wrong. It's appearing on my computer just fine.
Screenshot shows what I'm getting.

View attachment 8502
 


What is a "Two step" Markov chain?

A "Two step" Markov chain is a type of Markov chain that uses two previous states to determine the probability of transitioning to the next state. This is different from a traditional Markov chain, which only uses one previous state.

How is a "Two step" Markov chain different from a traditional Markov chain?

A "Two step" Markov chain takes into account two previous states to determine the probability of transitioning to the next state, while a traditional Markov chain only considers one previous state. This allows for a more complex analysis of the system's behavior.

In what types of systems is a "Two step" Markov chain commonly used?

"Two step" Markov chains are commonly used in systems where the behavior is influenced by multiple factors and not just the previous state. This can include economic models, weather forecasting, and biological systems.

What are the benefits of using a "Two step" Markov chain?

Using a "Two step" Markov chain allows for a more accurate analysis of systems that are influenced by multiple factors. It also allows for a more complex understanding of the system's behavior and can help in predicting future states.

Can any Markov chain be considered a "Two step" Markov chain?

No, not all Markov chains can be considered "Two step" Markov chains. A "Two step" Markov chain must use two previous states to determine the probability of transitioning to the next state. Traditional Markov chains only use one previous state, so they cannot be considered "Two step" Markov chains.
