Markov chains and steady state vectors

AI Thread Summary
Markov chains are mathematical models where the future state of a system depends only on its current state, not on past states. They can be represented using a probability matrix, where the initial state probabilities evolve through matrix multiplication over generations. A state space diagram can visually illustrate the transitions between states, with links indicating transition probabilities. This model is applicable to various real-world problems, particularly in engineering, where systems exhibit first-order behavior with probabilistic controls. Understanding Markov chains can enhance the ability to predict system developments effectively.
Physics is Phun
In my data management class we are studying matrices. We have broken the whole unit up into sections, and in small groups we have to teach the rest of the class a section of the matrix unit. Our section is Markov chains. I understand it myself, but I don't know if I'll be able to explain/teach it to others very well.
Could someone go over the key points and put them in simple terms, so I can best explain it to the rest of my class?

Thanks
 
www.mathworld.com

Markov chains are P_N = P_0^N, where P_0 is a probability matrix, right? And N is the number of generations?
 
A Markov chain is a sequence of random variables X_1, ..., X_n, where the probability that the system is in state x_n at time t_n depends only on the state the system was in at time t_(n-1).

Put very informally, it is a record of n occurrences, where each occurrence depends only on the occurrence that came immediately before it and on no earlier occurrence. (If you are a systems/control person, you can think of it as a first-order linear/nonlinear system.)
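
For reference, here is the standard statement of that property in the same notation as above:
P(X_n = x_n | X_(n-1) = x_(n-1), ..., X_1 = x_1) = P(X_n = x_n | X_(n-1) = x_(n-1)).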

A more convenient way to think of this is to draw a state-space diagram, with every link from one state to another labelled with the probability of that transition.

These transition probabilities can be recorded in the form of a matrix (say P). If Q_0 denotes the initial probability vector (the probabilities with which the system starts in each state), then the further development of the system can be predicted as
Q_1 = Q_0*P
Q_2 = Q_1*P
and so on...
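
As a quick illustration (not from the thread): here is a minimal Python/NumPy sketch of this update rule for a made-up two-state system, with transition probabilities chosen purely for illustration. Iterating Q_(n+1) = Q_n*P long enough gives the steady state vector from the thread title, i.e. the vector q with q = q*P, which can also be read off as the left eigenvector of P for eigenvalue 1.

import numpy as np

# Transition matrix P: row i gives the probabilities of moving from state i
# to each state (each row sums to 1). States: 0 = "sunny", 1 = "rainy".
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# Initial probability row vector Q_0: start in state 0 with certainty.
Q = np.array([1.0, 0.0])

# Evolve the distribution for a few generations: Q_(n+1) = Q_n * P.
for n in range(1, 11):
    Q = Q @ P
    print(f"Q_{n} = {Q}")

# After many steps Q settles to the steady state vector q satisfying q = q*P,
# which is the left eigenvector of P for eigenvalue 1, normalised to sum to 1.
eigvals, eigvecs = np.linalg.eig(P.T)
q = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
q = q / q.sum()
print("steady state vector:", q)

Running this, Q_n approaches the same vector as the eigenvector computation, which is the point of the "steady state" idea: the distribution stops changing under further multiplication by P.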

Note that this is just a mathematical model, and a great deal of mathematical work has been built on it. What this means is that if you ever find a real-world problem that follows this model, you can directly apply all the known results for Markov models to that problem.

Interestingly, this simple model is highly useful in many places. Many real-world engineering problems are of the first-order linear/nonlinear type, and if there is a probabilistic control associated with one of them, it becomes a perfect candidate for a Markov model.

-- AI
 