
Homework Help: Markov process

  1. Apr 25, 2010 #1
    If a Markov transition matrix's columns add up to 1, the matrix is a normal Markov transition matrix. Since it is normal, the Markov process must converge to a steady-state vector.

    Is this correct?
  3. Apr 25, 2010 #2
    Well, no. For example, consider the following Markov transition matrix:

    [tex]\begin{bmatrix}0 & 1 \\ 1 & 0\end{bmatrix}[/tex]

    The result of this process is that at each step the state vector is flipped, i.e. (1, 0) becomes (0, 1), and so on. So the Markov process itself doesn't converge to a steady-state vector, although the long-term time average does converge. However, if the Markov transition matrix is regular (that is, all the entries are strictly positive rather than merely non-negative), then the Markov process will converge.
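    To see this concretely, here is a small sketch (not from the thread, plain Python, 2×2 arithmetic written out by hand) that iterates the flip matrix above and records each state:

```python
# Sketch: the permutation matrix P = [[0, 1], [1, 0]] flips the state
# vector at every step, so the chain itself never settles down.

def step(P, x):
    """One step of the chain: x_{k+1} = P x_k (P acts on column vectors)."""
    return (P[0][0]*x[0] + P[0][1]*x[1],
            P[1][0]*x[0] + P[1][1]*x[1])

P = [[0, 1],
     [1, 0]]

x = (1.0, 0.0)
history = [x]
for _ in range(5):
    x = step(P, x)
    history.append(x)

print(history)
# The state alternates (1, 0), (0, 1), (1, 0), ... -- no single limit exists.
```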
    Last edited: Apr 25, 2010
  4. Apr 25, 2010 #3
    The steady-state vector is the "long-term" vector, since to obtain the steady-state vector we take the limit as [itex]n\rightarrow\infty[/itex] of

    [tex]A^n\mathbf{x}_0 = XD^nX^{-1}\mathbf{x}_0[/tex]

    where [itex]\mathbf{x}_0[/itex] is the initial state vector.
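    For a *regular* stochastic matrix this limit does exist. A minimal sketch (the matrix A below is my own illustrative choice, not from the thread): iterate a column-stochastic matrix with all entries positive and watch the state approach the eigenvector for eigenvalue 1, which here is (2/3, 1/3):

```python
# Sketch: a regular (all entries positive) column-stochastic matrix.
# Repeated application of A drives any probability vector to the
# steady-state vector, the eigenvector of A for eigenvalue 1.

def step(A, x):
    return (A[0][0]*x[0] + A[0][1]*x[1],
            A[1][0]*x[0] + A[1][1]*x[1])

A = [[0.9, 0.2],
     [0.1, 0.8]]  # columns sum to 1, entries strictly positive

x = (1.0, 0.0)
for _ in range(200):
    x = step(A, x)

print(x)  # approaches (2/3, 1/3)
```

    The second eigenvalue of this A is 0.7, so the error shrinks like 0.7^n, which is why the iteration converges quickly.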
  5. Apr 25, 2010 #4
    But this limit does not always exist.

    Consider the matrix in my previous post, with x0 = (1/3,2/3). Now if n is even, the expression
    [tex]XD^nX^{-1}\mathbf{x}_0[/tex] = (1/3,2/3)
    while if n is odd,
    [tex]XD^nX^{-1}\mathbf{x}_0[/tex] = (2/3,1/3)

    so the limit doesn't exist.

    Putting this in more concrete terms, suppose there is a stoplight that can be either red or green. Every minute, it changes color. What is the steady-state vector? It's not a steady state -- at any particular point in the future it will be either deterministically red or green. However, the long-term time average:

    [tex]\lim_{N\rightarrow\infty} \frac{1}{N}\sum_{n=1}^N A^n \mathbf{x}_0[/tex]

    will converge to (1/2,1/2).
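    The stoplight example can be sketched in a few lines of plain Python (my own illustration): the state never converges, but its running average does:

```python
# Sketch: the chain alternates forever, but the time average of the
# visited states converges to (1/2, 1/2).

def step(A, x):
    return (A[0][0]*x[0] + A[0][1]*x[1],
            A[1][0]*x[0] + A[1][1]*x[1])

A = [[0, 1],
     [1, 0]]

x = (1.0, 0.0)   # start deterministically in one state ("red", say)
total = [0.0, 0.0]
N = 1000
for _ in range(N):
    x = step(A, x)
    total[0] += x[0]
    total[1] += x[1]

avg = (total[0]/N, total[1]/N)
print(avg)  # -> (0.5, 0.5)
```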
  6. Apr 25, 2010 #5
    The probability vector

    [tex]\begin{bmatrix}1/2 \\ 1/2\end{bmatrix}[/tex]

    is a steady state vector.
  7. Apr 25, 2010 #6
    Yes, but the Markov process I described does not converge to that vector. Only the time-average does.
  8. Apr 25, 2010 #7
    We are dealing with a stochastic process, so none of the [tex]a_{ij}[/tex] can be negative. We don't have a stochastic matrix in your example.
    Last edited: Apr 25, 2010
  9. Apr 25, 2010 #8
    Also, I wrote that the columns add up to 1, not -1, in the transition matrix.
  10. Apr 25, 2010 #9
    The matrix

    [tex]\begin{bmatrix}0 & 1 \\ 1 & 0\end{bmatrix}[/tex]

    has columns that add up to 1, and the entries are non-negative.
  11. Apr 25, 2010 #10
    "In general, if A is an n×n stochastic matrix, then [tex]\lambda_1=1[/tex] is an eigenvalue of A and the remaining eigenvalues of A satisfy [tex]|\lambda_j| < 1, \quad j = 2, \ldots, n[/tex]" (Leon, pp. 313-314, 2010).

    Leon, S. (2010). Linear algebra with applications. Upper Saddle River, NJ: Pearson.
  12. Apr 25, 2010 #11
    [tex]1\nless 1[/tex]
  13. Apr 25, 2010 #12
    I don't have a copy of that book, but unless there are additional qualifiers, that statement just isn't right. *REGULAR* stochastic matrices satisfy that condition, but the inequality has to be weakened to "less than or equal to" to apply to all stochastic matrices in general.

    For example:

    Stochastic matrices

    A row (column) stochastic matrix is a square matrix each of whose rows (columns) consists of non-negative real numbers whose sum is unity. Strictly speaking the Perron-Frobenius theorem cannot be applied directly to such matrices because they need not be irreducible. If A is row-stochastic then the column vector with each entry 1 is clearly an eigenvector corresponding to the eigenvalue 1, which is also ρ(A) by the remark above. However it may not be the only eigenvalue on the unit circle and the associated eigenspace can be multi-dimensional.
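    This is exactly what happens with the flip matrix from earlier in the thread: its eigenvalues are 1 and -1, both on the unit circle. A quick check (my own sketch, using the characteristic polynomial of a 2×2 matrix):

```python
import math

# Eigenvalues of a 2x2 matrix from the characteristic polynomial
# lambda^2 - tr(A)*lambda + det(A) = 0 (quadratic formula).
def eigvals_2x2(A):
    tr = A[0][0] + A[1][1]
    det = A[0][0]*A[1][1] - A[0][1]*A[1][0]
    disc = math.sqrt(tr*tr - 4*det)  # real here, since P is symmetric
    return ((tr + disc)/2, (tr - disc)/2)

P = [[0, 1],
     [1, 0]]
print(eigvals_2x2(P))  # -> (1.0, -1.0): two eigenvalues on the unit circle
```

    The eigenvalue -1 is what makes the chain oscillate instead of converging, consistent with the "less than or equal to" correction above.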