Violation of the Markov property

  • Thread starter carllacan
In summary, the conversation discusses whether extending a meteorological model to account for season changes would violate the Markov property. The expert explains that it depends on the definition of "state" and provides an example of how to make the model Markov.
  • #1
carllacan
Suppose we have a meteorological model which describes the weather change as a Markov chain: we assume that the weather on each day depends only on the weather of the previous day. Suppose further that we extend this model to account for season changes, defining a different transition table for each season. Would this violate the Markov property of the model?

I think it would not, since each day's weather would still depend only on the previous day's weather, no matter the season. Am I right?
 
  • #2
carllacan said:
the weather would still depend just on the previous day, no matter if it was another season. Am I right?

You haven't given a definition of what a "state" is in your model. Whether it is Markov or non-Markov depends on that definition.

The various pieces of information in a model or a real-life process don't, by themselves, determine whether the model is a Markov process. To have a Markov process, the next "state" must depend only on the previous "state", so the Markov-ness or non-Markov-ness of the model depends on how you define "state".

If you define "state" to be the single variable W that represents "the weather for the day", then your model is not Markov, since W for the next day depends on both W for the previous day and the season. A variable representing the season is not present in that definition of "state".

One way to make your model Markov is to define a "state" to be a vector of 3 pieces of information (J, S, W), where J is the Julian date (1, 2, ..., 365), S is the season (summer, fall, winter, spring) and W is the weather (however you care to classify it).

(The Julian date is needed in order to define how the seasons change. For example, to define when spring begins, you need transition rules that say if the previous state is (78, winter, W = anything) then the next state has the form (79, spring, W = something).)
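The augmented state (J, S, W) can be sketched in code. This is a minimal illustration, not anything from the thread itself: the season boundaries, weather categories, and all transition probabilities below are made up for demonstration. The point is only that the `step` function computes the next state from the current state alone, which is exactly the Markov property.

```python
import random

# Hypothetical season boundaries by Julian day (1..365); day 78 is the
# last day of winter, matching the example in the post above.
SEASONS = {"winter": range(1, 79), "spring": range(79, 172),
           "summer": range(172, 266), "fall": range(266, 366)}

def season_of(day):
    """Map a Julian day (1..365) to its season."""
    for name, days in SEASONS.items():
        if day in days:
            return name
    raise ValueError(f"day out of range: {day}")

# One transition table per season: P(next weather | current weather).
# All probabilities here are invented for illustration.
TABLES = {
    "winter": {"sunny": {"sunny": 0.4, "rainy": 0.6},
               "rainy": {"sunny": 0.3, "rainy": 0.7}},
    "spring": {"sunny": {"sunny": 0.6, "rainy": 0.4},
               "rainy": {"sunny": 0.5, "rainy": 0.5}},
    "summer": {"sunny": {"sunny": 0.8, "rainy": 0.2},
               "rainy": {"sunny": 0.7, "rainy": 0.3}},
    "fall":   {"sunny": {"sunny": 0.5, "rainy": 0.5},
               "rainy": {"sunny": 0.4, "rainy": 0.6}},
}

def step(state):
    """One transition: the next state depends only on `state`, so the
    chain over (J, S, W) triples is first-order Markov."""
    day, season, weather = state
    probs = TABLES[season][weather]
    next_weather = random.choices(list(probs), weights=list(probs.values()))[0]
    next_day = day % 365 + 1          # wrap day 365 back to day 1
    return (next_day, season_of(next_day), next_weather)
```

With these boundaries, `step((78, "winter", w))` always yields a state of the form `(79, "spring", ...)`, mirroring the transition rule described above.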
 
  • #3
Stephen Tashi said:
You haven't given a definition of what a "state" is in your model. Whether it is Markov or non-Markov depends on that definition.


Thank you, great explanation.
 

1. What is the Markov property?

The Markov property is a fundamental concept in probability theory and stochastic processes. It states that the future state of a system depends only on the current state, and not on the sequence of events that led to the current state.

2. What does it mean to violate the Markov property?

Violation of the Markov property occurs when the future state of a system depends not only on the current state, but also on the sequence of events that led to it. Such a system is not memoryless, and the Markov property does not hold.

3. What are some examples of systems that violate the Markov property?

One example is a stock market, where the future price of a stock depends not only on its current price, but also on past prices and other market events. Another example is weather forecasting, as the future weather depends on past weather patterns and conditions.

4. What are the consequences of violating the Markov property?

Violating the Markov property can make it more difficult to model and predict the behavior of a system. It also means that the system has some form of memory, and its future behavior is influenced by events that occurred in the past.

5. How can we deal with the violation of the Markov property in modeling and analysis?

There are various methods that can be used to deal with the violation of the Markov property, such as incorporating additional variables or using more complex models. Another approach is to use non-Markovian models, which explicitly account for the system's memory and the influence of past events on future behavior.
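The first approach mentioned above, incorporating additional variables, is the same state-augmentation idea from the thread. As a minimal sketch with an invented example: if tomorrow's value depends on the last *two* days, the process is not first-order Markov, but the pair of consecutive values is, since each new pair is determined by the previous pair plus one fresh observation.

```python
def lift_to_pairs(history):
    """Re-encode a trajectory so each new 'state' is a pair of
    consecutive old states. If X[t+1] depends on (X[t], X[t-1]),
    then Y[t] = (X[t-1], X[t]) is a first-order Markov chain:
    Y[t+1] = (X[t], X[t+1]) depends only on Y[t]."""
    return list(zip(history, history[1:]))
```

For example, the trajectory `[1, 2, 3, 4]` becomes the pair-state trajectory `[(1, 2), (2, 3), (3, 4)]`.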
