Math Modeling with Markov chains

1. Oct 22, 2012

chuy52506

So I am trying to model the different states of education one can achieve. These include elementary school, middle school, high school, community college, and 4-year university. Each one will be a different state (e.g., state 1 = elementary school). Some states will be connected with a probability, meaning there is a probability of going from one state to another depending on which state you are in.
The connected states are: elementary school → middle school; middle school → high school; high school → community college; high school → 4-year university; community college → 4-year university; 4-year university → community college. So you can go from high school to either a community college or a 4-year university, and so on. All these states will also be connected to a dropout state, which will be an "absorbing" state. This means once the dropout state is reached there is no leaving it.

I was thinking of modeling this with a state diagram, a Markov chain to be exact. I'm trying to put this into a transition matrix whose entries are the probabilities. For each element S_ij, i represents the starting education-level state and j represents the ending education-level state after one year. This means that the row is the beginning state and the column is the ending state after one year. We then multiply this by an initial distribution vector whose entries are the percentages of people in each state at t=0. We call this vector V(t). Thus, with T being the transition matrix, V(1) = V(0)T and in general V(n+1) = V(n)T, treating V as a row vector (since the rows of T are the starting states; equivalently V(n+1) = T^T V(n) for column vectors).
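A minimal NumPy sketch of this setup, using the connections described above. Every probability below is a hypothetical placeholder, not data; the self-loop on the university state is likewise an illustrative assumption:

```python
import numpy as np

# States: 0=elementary, 1=middle, 2=high school,
# 3=community college, 4=4-year university, 5=dropout (absorbing).
# All probabilities are made-up placeholders.
T = np.array([
    [0.0, 0.9, 0.0, 0.0, 0.0, 0.1],  # elementary -> middle or dropout
    [0.0, 0.0, 0.9, 0.0, 0.0, 0.1],  # middle -> high school or dropout
    [0.0, 0.0, 0.0, 0.4, 0.4, 0.2],  # high school -> CC, university, or dropout
    [0.0, 0.0, 0.0, 0.0, 0.7, 0.3],  # community college -> university or dropout
    [0.0, 0.0, 0.0, 0.1, 0.6, 0.3],  # university -> CC, stay (self-loop), or dropout
    [0.0, 0.0, 0.0, 0.0, 0.0, 1.0],  # dropout is absorbing: no leaving it
])
assert np.allclose(T.sum(axis=1), 1.0)  # each row is a probability distribution

# Initial distribution V(0): everyone starts in elementary school.
v = np.array([1.0, 0.0, 0.0, 0.0, 0.0, 0.0])

# One-year update: v(n+1) = v(n) @ T, since rows of T are the starting states.
for _ in range(3):
    v = v @ T
print(v)  # distribution over states after 3 years
```

Because rows index the starting state, the row vector is propagated with `v @ T`; a column-vector formulation would use `T.T @ v` instead.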
I know each education level isn't a year long, so is there any way I can include that in my model as well? Am I going in the right direction with this? I'm not sure if this is the correct way to use Markov chains and transition matrices.
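One common way to handle levels that last more than one year, assuming each step of the chain is one year, is to give the state a self-loop probability p: the chain then stays in that state for an expected 1/(1-p) steps before leaving. A minimal sketch with hypothetical numbers:

```python
import numpy as np

# Hypothetical: a level lasting ~4 years on average can get a self-loop
# probability p with expected holding time 1/(1-p) = 4, i.e. p = 0.75.
# Two-state sketch: "in high school" with a self-loop, then "moved on".
p = 0.75
T = np.array([
    [p, 1 - p],   # stay in high school, or leave it this year
    [0.0, 1.0],   # "moved on" is absorbing in this toy sketch
])

v = np.array([1.0, 0.0])      # everyone starts in high school
expected_years = 1 / (1 - p)  # mean number of one-year steps in the state
print(expected_years)         # 4.0
```

Note this makes the time spent in a level geometrically distributed, which is an approximation; if every student takes exactly k years, a chain of k sub-states is an alternative.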

2. Oct 22, 2012

chuy52506

Can anyone point me in the right direction?

3. Oct 22, 2012

Stephen Tashi

Explain what your goal is. Are you trying to do a serious analysis of education levels? Or are you trying to learn about Markov chains by making up a simple example? Or are you trying to learn about Markov chains by making up a complicated example and implementing it in a computer program?

There are Markov chains that proceed through time in discrete steps, and there are Markov chains that have distributions giving the probability that a state change will occur at time t. If you want to follow the usual path of learning about Markov chains, start with the ones that proceed in discrete steps.
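The distinction can be illustrated with a hypothetical two-state chain (0 = enrolled, 1 = dropped out); everything below, including the rates, is made up for illustration:

```python
import random

# Discrete-time: one transition attempt per step (e.g., per year).
def discrete_steps_until_absorbed(p_drop=0.2, rng=random.Random(0)):
    steps = 0
    while True:
        steps += 1
        if rng.random() < p_drop:
            return steps  # number of whole steps until absorption

# Continuous-time: the state is held for an exponentially distributed
# time, then the chain jumps; rate=0.2/year means a mean hold of 5 years.
def continuous_time_until_absorbed(rate=0.2, rng=random.Random(0)):
    return rng.expovariate(rate)  # a real-valued time, not a step count

print(discrete_steps_until_absorbed())
print(continuous_time_until_absorbed())
```

The first returns an integer number of steps; the second a real-valued time, which is the essential difference between the two formulations.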