Violation of the Markov property

  • Context: Undergrad 
SUMMARY

The discussion centers on the implications of defining a Markov chain in a meteorological model that incorporates seasonal changes. It concludes that the model can maintain the Markov property if the "state" is defined as a vector comprising the Julian date (J), season (S), and weather (W). If the state is defined solely as the weather for the day (W), the model fails to be Markov since the next day's weather depends on both the previous day's weather and the season. Thus, the definition of "state" is crucial in determining the Markov-ness of the model.

PREREQUISITES
  • Understanding of Markov chains and the Markov property
  • Familiarity with meteorological modeling concepts
  • Knowledge of state definitions in probabilistic models
  • Basic understanding of Julian dates and seasonal transitions
NEXT STEPS
  • Research the implications of state definitions in Markov processes
  • Explore advanced meteorological modeling techniques
  • Learn about vector state representations in stochastic models
  • Investigate the role of seasonal factors in predictive modeling
USEFUL FOR

Mathematicians, data scientists, meteorologists, and anyone involved in modeling weather patterns or studying Markov processes.

carllacan
Suppose we have a meteorological model which describes the weather as a Markov chain: we assume that the weather on each day depends only on the weather of the previous day. Suppose further that we extend this model to account for seasonal change, defining a different transition table for each season. Would this violate the Markov property?

I think it would not, since the weather would still depend only on the previous day, no matter the season. Am I right?
 
carllacan said:
the weather would still depend just on the previous day, no matter if it was another season. Am I right?

You haven't given a definition of what a "state" is in your model. Whether it is Markov or non-Markov depends on that definition.

The various pieces of information in a model or a real-life process don't, by themselves, determine whether the model is a Markov process. To have a Markov process, the next "state" must depend only on the previous "state", so the Markov-ness or non-Markov-ness of the model depends on how you define "state".

If you define "state" to be the single variable W that represents "the weather for the day", then your model is not Markov, since W for the next day depends on both W for the previous day and the season. A variable representing the season is not present in that definition of "state".
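A minimal numerical sketch of this failure (the two weather states and all probabilities here are invented for illustration): if each season has its own transition table, then the distribution of tomorrow's weather is not a function of today's weather alone.

```python
# Hypothetical per-season transition tables: tables[season][today][tomorrow]
# gives P(tomorrow's weather | today's weather) in that season.
tables = {
    "summer": {"sunny": {"sunny": 0.9, "rainy": 0.1},
               "rainy": {"sunny": 0.7, "rainy": 0.3}},
    "winter": {"sunny": {"sunny": 0.5, "rainy": 0.5},
               "rainy": {"sunny": 0.2, "rainy": 0.8}},
}

# Knowing only W = "sunny" does not pin down tomorrow's distribution:
# the season acts as a hidden dependency, so W alone is not a Markov state.
assert tables["summer"]["sunny"]["rainy"] != tables["winter"]["sunny"]["rainy"]
```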

One way to make your model Markov is to define a "state" to be a vector of 3 pieces of information (J, S, W), where J is the Julian date (1, 2, ..., 365), S is the season (summer, fall, winter, spring) and W is the weather (however you care to classify it).

(The Julian date is needed in order to define how the seasons change. For example, to define when spring begins, you need transition rules that say if the previous state is (78, winter, W = anything), then the next state has the form (79, spring, W = something).)
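The vector-state construction above can be sketched as follows. This is a toy implementation: the season cutoffs, the two weather states, and the probabilities are all invented for illustration. Note that the next state is computed entirely from the current (J, S, W) triple, which is exactly the Markov property.

```python
import random

def season_of(j):
    """Season for Julian day j (hypothetical cutoffs; spring starts day 79)."""
    if 79 <= j < 172:
        return "spring"
    if 172 <= j < 266:
        return "summer"
    if 266 <= j < 355:
        return "fall"
    return "winter"

# Made-up two-state weather tables, shared by pairs of seasons.
warm = {"sunny": {"sunny": 0.9, "rainy": 0.1},
        "rainy": {"sunny": 0.7, "rainy": 0.3}}
cold = {"sunny": {"sunny": 0.5, "rainy": 0.5},
        "rainy": {"sunny": 0.2, "rainy": 0.8}}
tables = {"spring": warm, "summer": warm, "fall": cold, "winter": cold}

def step(state, rng=random.random):
    """One transition of the (J, S, W) chain.

    The next state depends only on the current state: the date advances
    deterministically, the season is a function of the new date, and the
    weather is drawn from the current season's transition table.
    """
    j, s, w = state
    j_next = j % 365 + 1          # day 365 wraps back to day 1
    s_next = season_of(j_next)
    r = rng()
    for w_next, p in tables[s][w].items():
        r -= p
        if r <= 0:
            break
    return (j_next, s_next, w_next)

# Day 78 is the last day of winter in this toy calendar:
print(step((78, "winter", "sunny"), rng=lambda: 0.0))  # → (79, 'spring', 'sunny')
```

The season S is technically redundant (it is a function of J), but carrying it in the state keeps the transition rules readable and matches the description above.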
 
Stephen Tashi said:
You haven't given a definition of what a "state" is in your model. Whether it is Markov or non-Markov depends on that definition.

Thank you, great explanation.
 
