# Finding the rate of convergence for a Markov chain

• bonfire09
In summary: the rate of convergence is the speed at which a Markov chain approaches its stationary distribution. For a finite ergodic chain it is governed by the second-largest eigenvalue modulus of the transition matrix.
bonfire09

## Homework Statement

For the following Markov chain, find the rate of convergence to the stationary distribution:
$\begin{bmatrix} 0.4 & 0.6 \\ 1 & 0 \end{bmatrix}$

## Homework Equations

none

## The Attempt at a Solution

I found the eigenvalues, which were $\lambda_1=-.6$ and $\lambda_2=1$. The corresponding eigenvectors I found were $\vec{v_1}=\begin{bmatrix} -0.6 & 1 \end{bmatrix}$ and $\vec{v_2}=\begin{bmatrix} -1 & 1 \end{bmatrix}$. The stationary distribution I found that satisfies $p=pA$ (where $A$ is the transition matrix) and $p_1+p_2=1$ is $\vec{p}=\begin{bmatrix} .625 & .375 \end{bmatrix}$. From here I do not know how to get the rate of convergence. I think it has something to do with the eigenvalues or eigenvectors. Any help would be great, thanks.
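As a quick numerical check of the attempt above (a NumPy sketch, not part of the original post), the eigenvalues and the stationary distribution can be recomputed directly; the stationary distribution is the left eigenvector of $A$ for eigenvalue 1, normalized to sum to 1:

```python
import numpy as np

# Transition matrix from the problem (rows sum to 1)
A = np.array([[0.4, 0.6],
              [1.0, 0.0]])

# Eigenvalues of A: should be 1 and -0.6
eigvals = np.linalg.eigvals(A)
print(sorted(eigvals.real))

# The stationary distribution solves p = pA with p1 + p2 = 1,
# i.e. p is a left eigenvector of A for eigenvalue 1,
# which is a (right) eigenvector of A transposed.
vals, vecs = np.linalg.eig(A.T)
idx = np.argmin(np.abs(vals - 1.0))
p = vecs[:, idx].real
p = p / p.sum()          # normalize so the entries sum to 1
print(p)                 # approximately [0.625, 0.375]
```

This confirms the eigenvalues $-0.6$ and $1$ and the stationary distribution $(0.625, 0.375)$ reported in the post.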

I checked the eigenvectors. I had them switched. So do I use the formula right below where it says "since the eigenvectors are orthonormal..."?

That depends on what you think it does. It's best to only use formulas that one understands. Without going through every step of the maths, do you understand broadly why that formula has that form?

Actually I don't. We talked about limiting distributions and stationary distributions in class but did not cover rate of convergence. In essence I only understand the very basics. So somehow I have to show the rate of convergence with what I have so far.

Well, since the eigenvalues are distinct, the matrix's eigenvectors will be linearly independent and hence form a basis for the vector space. So any vector ##\mathbf{x}## can be expressed as a linear sum ##\sum_{i=1}^na_i\mathbf{u}_i## of eigenvectors, as shown in the wiki article. If only one of the eigenvalues, say ##\lambda_1##, corresponding to eigenvector ##\mathbf{u}_1##, has absolute value equal to 1, and all others have absolute values less than 1, what will happen to ##\mathbf{x}=\sum_{i=1}^na_i\mathbf{u}_i## as you repeatedly right-multiply it by the matrix? Think about the impact of each term in the sum separately.
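The effect described above can be illustrated numerically (a hypothetical sketch, assuming the transition matrix and stationary distribution from the original post): the component of the error along the eigenvector for ##\lambda_2 = -0.6## is scaled by ##\lambda_2^n## after ##n## steps, so the distance to the stationary distribution shrinks by a factor of ##|\lambda_2| = 0.6## per step:

```python
import numpy as np

A = np.array([[0.4, 0.6],
              [1.0, 0.0]])
p = np.array([0.625, 0.375])   # stationary distribution

x = np.array([1.0, 0.0])       # an arbitrary starting distribution
errors = []
for n in range(10):
    errors.append(np.abs(x - p).sum())  # L1 distance to stationarity
    x = x @ A                           # one step of the chain

# Each error is 0.6 times the previous one: geometric decay
# at rate |lambda_2| = 0.6
ratios = [errors[n + 1] / errors[n] for n in range(9)]
print(ratios)
```

For this two-state chain the ratio is exactly 0.6 at every step, because the error lies entirely along the single sub-dominant eigenvector.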

bonfire09 said:

For the following Markov chain, find the rate of convergence to the stationary distribution... From here I do not know how to get the rate of convergence. I think it has something to do with the eigenvalues or eigenvectors.

Please turn off the bold font, reserving it for emphasis on individual words, etc., like what I do below.

Anyway, do you know the definition of rate of convergence? Is this discussed in your textbook or course notes? Even if it is in a section of your text that you have not reached yet, I would bet you would be allowed to read it ahead of time and use the material as needed. If such material is not in your text or course notes, please tell us so we can point you in the appropriate direction.

## 1. What is a Markov chain?

A Markov chain is a mathematical model that describes a system that changes states over time, where the probability of transitioning from one state to another depends only on the current state and not on any previous states.
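For illustration (a hypothetical simulation, not from the thread), the Markov property means a simulation needs only the current state to draw the next one; here the transition probabilities are the ones from the problem above:

```python
import random

# Two-state chain from the problem: from state 0 go to state 0
# with probability 0.4 and to state 1 with probability 0.6;
# from state 1 always return to state 0.
P = {0: [(0, 0.4), (1, 0.6)],
     1: [(0, 1.0)]}

def step(state):
    # The next state depends only on the current state
    # (the Markov property), not on the earlier history.
    r = random.random()
    total = 0.0
    for nxt, prob in P[state]:
        total += prob
        if r < total:
            return nxt
    return nxt  # guard against floating-point round-off

state = 0
path = [state]
for _ in range(10):
    state = step(state)
    path.append(state)
print(path)
```

Note that every visit to state 1 is immediately followed by state 0, as the transition matrix dictates.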

## 2. What is the rate of convergence for a Markov chain?

The rate of convergence for a Markov chain refers to how quickly the system approaches its long-term equilibrium state. It is a measure of how fast the probabilities of being in each state stabilize over time.

## 3. How is the rate of convergence calculated for a Markov chain?

The rate of convergence for a Markov chain can be calculated from the eigenvalues of the transition matrix. For a stochastic matrix the largest eigenvalue is always 1; for an ergodic chain the rate of convergence is governed by the second-largest eigenvalue modulus, since the components of the error along the other eigenvectors are scaled by their eigenvalues at every step.
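A minimal sketch of the eigenvalue approach (assuming NumPy; the helper name `slem` is mine, not from the thread):

```python
import numpy as np

def slem(P):
    """Second-largest eigenvalue modulus of a stochastic matrix P.

    For an ergodic chain, the distance to the stationary
    distribution decays roughly like slem(P)**n after n steps.
    """
    mods = np.sort(np.abs(np.linalg.eigvals(P)))[::-1]
    return mods[1]  # mods[0] is the Perron eigenvalue, 1

P = np.array([[0.4, 0.6],
              [1.0, 0.0]])
print(slem(P))  # 0.6 for the chain in this thread
```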

## 4. What factors affect the rate of convergence for a Markov chain?

The asymptotic rate of convergence is determined by the structure of the transition matrix itself, through its second-largest eigenvalue modulus. The initial state distribution and the number of steps taken affect how large the remaining error is at any given time, but not the geometric rate at which it shrinks. In general, convergence is faster for chains whose second-largest eigenvalue modulus is well below 1.

## 5. Why is it important to find the rate of convergence for a Markov chain?

Knowing the rate of convergence for a Markov chain can provide valuable insights into the behavior and stability of the system. It can also be used to compare different Markov chains and identify which ones converge faster. Additionally, the rate of convergence can be used to determine the number of steps needed for a Markov chain to reach a certain level of accuracy.
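As a rough sketch of that last use (assuming the error decays like ##|\lambda_2|^n## up to a constant factor, with ##|\lambda_2| = 0.6## for the chain in this thread):

```python
import math

slem = 0.6    # second-largest eigenvalue modulus for this chain
eps = 1e-6    # desired accuracy

# Error ~ C * slem**n, so (ignoring the constant C) we need
# slem**n <= eps, i.e. n >= log(eps) / log(slem).
n = math.ceil(math.log(eps) / math.log(slem))
print(n)
```

For this chain the estimate works out to 28 steps to bring the error factor below $10^{-6}$, up to the constant set by the initial distribution.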
