Determining the states in Markov chains

  • Thread starter: stripes
  • Tags: States
In summary: the general formula for the number of states is m^n, where m is the number of possible events on each day and n is the number of previous days being considered. So for the first example, with n = 1 and m = 3, the number of states is 3^1 = 3; for the second, with n = 2 and m = 2, it is 2^2 = 4; and for the third, with n = 3 and m = 2, it is 2^3 = 8. This is a correct way of thinking about Markov chains and of finding the number of states in real-world examples.
  • #1
stripes

Homework Statement



This is not a homework question, just me trying to wrap my head around things. My probability class covered Markov chains in less than two hours of lecture, and I've been super sick lately, so I'm still a little confused.

If we're considering real-world examples where the probability of an event happening tomorrow depends on what happened in the previous n days, and on each day m different events could have happened, is the number of states just m^n?

Two practice problems we were given are as follows:

Joe's mood depends only on his mood from the previous day (so we're going back 1 day). He could be gloomy, so-so, or cheerful (3 possible events). This means we have a Markov chain with 3^1 = 3 states?

The probability that it rains tomorrow depends on whether it rained on each of the past 2 days (going back 2 days). If the weather on any day can be either rainy or dry, then the number of possible events is 2, so the number of states is 2^2 = 4?

One more example I came across was the weather one with more days: suppose the probability that it rains tomorrow depends only on whether it rained on each of the past 3 days (going back 3 days). Each day can be either rainy or dry (2 possible events), so the number of states is 2^3 = 8?

I know that these answers are correct, but is my way of thinking about it right? I will be given real-world examples like these (involving previous days and so on) on my final exam, so when I'm figuring out the number of states, am I going about it correctly?

I guess I could just figure it out by listing RRR (rain rain rain), RRD (rain rain dry), RDR, RDD, and so on. But I get the feeling I'm missing the point (how to figure these out intuitively); I take it that's because I've missed some classes due to illness.
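For what it's worth, a quick way to check the m^n count is to enumerate the states by brute force. Here is a minimal Python sketch (just an illustration) that lists every possible history of the last n days for the three examples:

```python
from itertools import product

def states(outcomes, n):
    """Every possible history of the last n days; each history is one state."""
    return list(product(outcomes, repeat=n))

# Joe's mood: m = 3 outcomes, n = 1 previous day -> 3^1 = 3 states
print(len(states(["gloomy", "so-so", "cheerful"], 1)))  # 3

# Rain over the past 2 days: m = 2, n = 2 -> 2^2 = 4 states
print(len(states("RD", 2)))  # 4

# Rain over the past 3 days: m = 2, n = 3 -> 2^3 = 8 states
for s in states("RD", 3):
    print("".join(s))  # RRR, RRD, RDR, RDD, DRR, DRD, DDR, DDD
```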

If someone could let me know if I'm going about this correctly I would really appreciate it.

Homework Equations



--

The Attempt at a Solution



--
 
  • #2
stripes said:
I guess I could just figure it out by listing RRR (rain rain rain), RRD (rain rain dry), RDR, RDD, and so on.

Your 8 states RRR, RRD, ... are correct; that is the way these things are usually done. By letting each state record the outcomes of the last n days, you turn the order-n dependence into an ordinary (first-order) Markov chain on m^n states.
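
To make this concrete, here is a minimal Python sketch of how the order-3 weather example is usually turned into an ordinary 8-state chain. The rain probabilities are made-up numbers for illustration only; the structural point is that from state xyz, tomorrow's state can only be yzR or yzD:

```python
from itertools import product

import numpy as np

states = ["".join(s) for s in product("RD", repeat=3)]  # RRR, RRD, ..., DDD
index = {s: i for i, s in enumerate(states)}

# Hypothetical rule: P(rain tomorrow | last 3 days). Made up for illustration.
def p_rain(history):
    return 0.2 + 0.2 * history.count("R")  # more recent rain -> rain more likely

# From state xyz the chain can only move to yzR or yzD; all other entries are 0.
P = np.zeros((8, 8))
for s in states:
    p = p_rain(s)
    P[index[s], index[s[1:] + "R"]] = p
    P[index[s], index[s[1:] + "D"]] = 1 - p

assert np.allclose(P.sum(axis=1), 1)  # every row is a probability distribution
```

Once the chain is in this first-order form, all the standard Markov chain machinery (matrix powers, stationary distributions) applies directly.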
 

1. How do you determine the states in a Markov chain?

To determine the states in a Markov chain, identify the smallest set of distinct situations that captures everything the next transition's probabilities depend on. In examples like the ones above, that means one state per possible history of the last n days, which is exactly why the count is m^n.

2. What is the role of transition probabilities in determining the states in a Markov chain?

Transition probabilities represent the likelihood of moving from one state to another in a single step. Once the state space has been chosen, these probabilities fully specify the chain; in practice they are usually estimated from observed data, as sketched below.
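
For illustration, here is a minimal Python sketch of the most common approach: estimating transition probabilities by counting transitions in observed data. The weather record below is made up; in practice you would use real observations:

```python
from collections import Counter, defaultdict

observed = "RRDRRRDDRDRRDDDRRDRD"  # hypothetical daily record: R = rain, D = dry
n = 2                              # assume today depends on the previous 2 days

# Count how often each 2-day history is followed by each outcome.
counts = defaultdict(Counter)
for i in range(len(observed) - n):
    history, nxt = observed[i:i + n], observed[i + n]
    counts[history][nxt] += 1

# Normalize the counts into estimated transition probabilities.
for history, c in sorted(counts.items()):
    total = sum(c.values())
    print(history, "->", {o: round(k / total, 2) for o, k in c.items()})
```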

3. Can the number of states in a Markov chain change over time?

The state space of a given Markov chain model is fixed by definition, but the real system it describes can drift: new situations may become relevant and old ones may stop occurring. This is why it is worth periodically re-examining whether the chosen states still describe the process well.

4. What are some common methods for determining the states in a Markov chain?

Some common methods for determining the states in a Markov chain include using observational data, conducting surveys or experiments, and using mathematical models and simulations. Each method has its own advantages and limitations, and the most appropriate one will depend on the specific system or process being studied.

5. How can the states in a Markov chain be used for predictive analysis?

The states in a Markov chain can be used for predictive analysis by propagating the current state distribution forward through the transition probabilities: the distribution k steps ahead is the current distribution multiplied by the k-th power of the transition matrix. This helps forecast likely outcomes and inform decision-making in fields such as finance, marketing, and engineering.
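
As a minimal sketch of this idea, here is a hypothetical 2-state rain/dry chain (the numbers are made up) whose state distribution is propagated forward a few days at a time:

```python
import numpy as np

# Hypothetical one-day-memory weather chain; states are (rain, dry).
P = np.array([[0.7, 0.3],   # P(rain | rain), P(dry | rain)
              [0.4, 0.6]])  # P(rain | dry),  P(dry | dry)

pi = np.array([1.0, 0.0])   # it is raining today
for k in range(1, 6):
    pi = pi @ P             # distribution k days ahead
    print(f"day {k}: P(rain) = {pi[0]:.3f}")
# As k grows, pi converges to the stationary distribution of the chain.
```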
