Transforms that Preserve The Dominant Eigenvector?

  • Context: Graduate
  • Thread starter: csguy
  • Tags: Eigenvector
SUMMARY

This discussion focuses on preserving the dominant eigenvector of a stochastic matrix while transforming the matrix to zero out its antidiagonal elements. The user recalls established spectrum-preserving methods such as the QR method, Jacobi rotations, and Householder matrices, and asks which techniques also keep the dominant eigenvector fixed. The proposed transformation constructs a new matrix B that retains the eigenvalue-eigenvector relationship, so that B applied to the dominant eigenvector yields that eigenvector scaled by the same eigenvalue.

PREREQUISITES
  • Understanding of stochastic matrices and Markov chains
  • Familiarity with eigenvalues and eigenvectors
  • Knowledge of matrix transformations and their properties
  • Experience with numerical methods such as QR decomposition and Jacobi rotations
NEXT STEPS
  • Research methods for preserving eigenvectors in matrix transformations
  • Explore the QR method in detail for its application in eigenvalue problems
  • Study Householder transformations and their impact on matrix structure
  • Investigate the implications of modifying antidiagonal elements in stochastic matrices
USEFUL FOR

Mathematicians, data scientists, and researchers working with Markov chains and stochastic processes who are interested in matrix transformations and eigenvalue analysis.

csguy
Hi,

I'm working with stochastic matrices (square matrices whose entries give the probability of moving from one state to another in a Markov chain), and I'm looking for transforms that preserve the dominant eigenvector (the "stationary distribution" of the chain). Specifically, I want to make the antidiagonal of the matrix zero.
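For concreteness, the dominant eigenvector mentioned above can be computed by power iteration; here is a minimal sketch, with a made-up 3-state row-stochastic matrix P for illustration (the stationary distribution is the dominant *left* eigenvector, with eigenvalue 1):

```python
import numpy as np

# Hypothetical 3-state row-stochastic transition matrix (each row sums to 1).
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.2, 0.2, 0.6]])

# Power iteration on the left eigenvector: iterate pi <- pi P and renormalize.
pi = np.full(3, 1 / 3)
for _ in range(1000):
    pi = pi @ P
    pi /= pi.sum()

# pi now approximates the stationary distribution: pi P = pi.
print(pi)
```

Any transform of P that claims to preserve the dominant eigenvector can be checked numerically by recomputing this fixed point afterwards.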

I remember studying a host of methods that preserve the spectrum (e.g. the QR method, Jacobi rotations, Householder matrices), but which methods preserve the dominant eigenvector?

Any suggestions?
 
Suppose [itex]A \textbf{v} = \lambda \textbf{v}[/itex], where [itex]\textbf{v}, \lambda[/itex] is the dominant eigenvector/eigenvalue pair, with components [itex]v_1, v_2, ..., v_n[/itex]. Then you could do something like
[tex]B = \lambda \left[\begin{matrix} 1 & 0 & 0 & \cdots & 0 \\ \frac{v_2}{v_1} & 0 & 0 & \cdots & 0 \\ \frac{v_3}{v_1} & 0 & 0 & \cdots & 0 \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ \frac{v_{n-1}}{v_1} & 0 & 0 & \cdots & 0 \\ 0 & \frac{v_n}{v_2} & 0 & \cdots & 0 \end{matrix}\right][/tex]

I think [itex]B \textbf{v} = \lambda \textbf{v}[/itex] if you work out the multiplication.
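The multiplication does work out: each row of B picks out a single component of v and rescales it by the right ratio, and every antidiagonal entry is zero by construction. A quick numerical sketch of the construction (the random stochastic matrix A is made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.random((4, 4))
A /= A.sum(axis=1, keepdims=True)  # make A row-stochastic

# Dominant (right) eigenpair of A.
w, V = np.linalg.eig(A)
k = np.argmax(np.abs(w))
lam, v = w[k].real, V[:, k].real

# Build B as in the post: first column carries v_i / v_1 for rows 1..n-2,
# the last row carries v_n / v_2 in column 2, everything else is zero.
n = len(v)
B = np.zeros((n, n))
B[0, 0] = 1.0
for i in range(1, n - 1):
    B[i, 0] = v[i] / v[0]
B[n - 1, 1] = v[n - 1] / v[1]
B *= lam

print(np.allclose(B @ v, lam * v))  # the eigenpair is preserved
```

Note that B preserves the eigenpair but is not itself stochastic, so whether this suits the original Markov-chain setting depends on what else the transform must preserve.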
 
