[Graph Theory] Markov Chains : Part 2 - Spectral View of Markov Chains and Random Walks
Outline
- References
- 1. Eigenspace of Transition Matrix
- 2. Spectral Analysis of Markov Chains
- 3. Random Walks on Graphs
References
- A Tutorial on the Spectral Theory of Markov Chains, Seabrook E, Wiskott L., 2023
- Everything about Markov Chains, Cai Leran
- MIT OCW 6.041 Probabilistic Systems Analysis And Applied Probability - Lecture 16, John Tsitsiklis
1. Eigenspace of Transition Matrix
As previously discussed in Part 1, the eigenspace of the transition matrix $\large P$ is of great significance in interpreting the dynamic behavior of Markov chains.
Given an irreducible Markov chain whose transition matrix $\large P$ is diagonalizable, $\large P$ can be written as $\large Y_{R} \, \Lambda \, Y_{R}^{-1}$, where $\large Y_{R} \,\in\, \mathbb{R}^{N\times N}$ is the matrix whose $\large j$-th column is the right eigenvector $\large r_{j} \,\in\, \mathbb{R}^{N}$, and the rows of $\large Y_{R}^{-1}$ are the transposes of the left eigenvectors $\large l_{i} \,\in\, \mathbb{R}^{N}$ of $\large P$.
$\large \Lambda$ is a diagonal matrix whose $\large i$-th diagonal entry is the eigenvalue associated with the $\large i$-th pair of left and right eigenvectors.
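This decomposition is easy to verify numerically. The following is a minimal NumPy sketch using a hypothetical 3-state transition matrix (the entries are purely illustrative):

```python
import numpy as np

# A hypothetical 3-state irreducible transition matrix (each row sums to 1).
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])

# Columns of Y_R are right eigenvectors; Lambda holds the eigenvalues.
eigvals, Y_R = np.linalg.eig(P)
Lam = np.diag(eigvals)

# P is recovered as Y_R @ Lambda @ Y_R^{-1}.
P_reconstructed = Y_R @ Lam @ np.linalg.inv(Y_R)
print(np.allclose(P_reconstructed.real, P))
```

Note that `np.linalg.eig` returns right eigenvectors; the rows of `np.linalg.inv(Y_R)` then play the role of the (transposed) left eigenvectors.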
Regarding the eigenspace of a diagonalizable $\large P$, this section relies on the following statements.
- Every eigenvalue of $\large P$ satisfies $\large |\lambda| \, \leq \, 1$ in complex modulus.
  - This follows from the well-known Perron–Frobenius theorem: for a non-negative irreducible matrix, there exists a unique positive eigenvalue (the Perron root, $\large \lambda_{pf}$) associated with a positive eigenvector, and it is the largest eigenvalue in modulus.
  - $\large \lambda_{pf}$ is therefore the spectral radius of the matrix.
  - Although the theorem strictly applies to irreducible matrices, it generalizes to the smaller irreducible blocks within a reducible matrix.
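As a quick sanity check, we can confirm both bounds numerically for a hypothetical 2-state stochastic matrix (values illustrative):

```python
import numpy as np

# A hypothetical irreducible stochastic matrix (rows sum to 1).
P = np.array([[0.1, 0.9],
              [0.4, 0.6]])

moduli = np.abs(np.linalg.eigvals(P))

# All eigenvalue moduli are bounded by 1, and the largest one
# (the spectral radius, i.e., the Perron root) equals 1.
print(moduli.max())
```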
- $\large \lambda = 1$ is the dominant eigenvalue (i.e., $\large \lambda_{pf}$), so no other eigenvalue exceeds 1 in modulus.
  - Intuitively, suppose some $\large |\lambda_{i}| \, > \, 1$. The chain evolves as $\large \sigma P^{t} \, = \, \pi \, + \, \sum_{i=2}^{N} c_{i} \lambda_{i}^{t} l_{i}^{T}$, so $\large \sigma P^{t}$ would either grow exponentially with time or develop negative entries as $\large t$ grows, both contradicting the fact that $\large \sigma P^{t}$ remains a probability distribution.
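The spectral form of the evolution can be checked directly: summing $c_i \lambda_i^{t} l_i^{T}$ over all eigenpairs (with $c_i = \sigma \cdot r_i$) reproduces $\sigma P^{t}$. A sketch with a hypothetical 3-state chain and an illustrative initial distribution:

```python
import numpy as np

# Hypothetical 3-state chain and initial distribution sigma.
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])
sigma = np.array([1.0, 0.0, 0.0])

eigvals, Y_R = np.linalg.eig(P)
Y_R_inv = np.linalg.inv(Y_R)   # rows are the (transposed) left eigenvectors l_i

t = 10
# Direct evolution: sigma P^t.
direct = sigma @ np.linalg.matrix_power(P, t)

# Spectral form: sum_i c_i * lambda_i^t * l_i^T, with c_i = sigma @ r_i.
spectral = sum((sigma @ Y_R[:, i]) * eigvals[i] ** t * Y_R_inv[i, :]
               for i in range(3))
print(np.allclose(direct, spectral))
```

The $i = 1$ term (eigenvalue 1) is exactly the stationary part $\pi$, while the remaining terms decay as $\lambda_i^{t}$.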
- The eigenspace of $\large \lambda = 1$ is guaranteed to be one-dimensional for an irreducible chain.
  - $\large \pi P = \pi$, which makes $\large \pi$ a left eigenvector with eigenvalue 1.
  - A left eigenvector with eigenvalue 1 is not by itself a stationary distribution; it must be normalized so that its entries sum to 1, and for an irreducible chain this normalized vector is the unique stationary distribution.
  - Consequently, the spectral radius of $\large P$ is 1.
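In practice, the stationary distribution can be extracted as the left eigenvector of eigenvalue 1, i.e., a right eigenvector of $P^{T}$. A sketch with a hypothetical 2-state chain:

```python
import numpy as np

# A hypothetical irreducible 2-state chain.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# Left eigenvectors of P are right eigenvectors of P^T.
eigvals, vecs = np.linalg.eig(P.T)
idx = np.argmin(np.abs(eigvals - 1.0))   # locate the eigenvalue 1
pi = vecs[:, idx].real
pi = pi / pi.sum()                        # normalize into a distribution

print(pi)
print(np.allclose(pi @ P, pi))   # pi P = pi
```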
- For irreducible and aperiodic chains, $\large |\lambda| = 1$ is possible only for $\large \lambda = 1$.
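A periodic chain shows why aperiodicity is needed: the deterministic 2-state cycle is irreducible but has period 2, and it picks up an eigenvalue on the unit circle other than 1.

```python
import numpy as np

# A 2-state periodic chain: the walker deterministically alternates states.
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])

eigvals = np.sort(np.linalg.eigvals(P).real)
print(eigvals)   # contains -1, so |lambda| = 1 also occurs at lambda = -1
```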
2. Spectral Analysis of Markov Chains
By analyzing the spectral distribution of the eigenvalues of $\large P$, we can extract fundamental information about a Markov chain.
First, I will consider the case of irreducible chains, which always have $\large \lambda = 1$, and then generalize the results to reducible chains.
To begin with, the temporal evolution of a Markov chain is fully described by powers of the eigenvalues, since $\large P^{t} \, = \, Y_{R} \, \Lambda^{t} \, Y_{R}^{-1}$.
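One practical consequence is that the second-largest eigenvalue modulus governs how fast the chain mixes. A sketch with a hypothetical 2-state chain whose eigenvalues work out to 1 and 0.5, so the distance to stationarity should shrink by roughly a factor of 0.5 per step:

```python
import numpy as np

# Hypothetical 2-state chain; its eigenvalues are 1 and 0.5.
P = np.array([[0.8, 0.2],
              [0.3, 0.7]])
sigma = np.array([1.0, 0.0])

lam2 = sorted(np.abs(np.linalg.eigvals(P)))[-2]  # second-largest modulus

# Stationary distribution: normalized left eigenvector of eigenvalue 1.
w, v = np.linalg.eig(P.T)
pi = v[:, np.argmin(np.abs(w - 1.0))].real
pi = pi / pi.sum()

# L1 distance to stationarity after t steps, for a few values of t.
dists = [np.abs(sigma @ np.linalg.matrix_power(P, t) - pi).sum()
         for t in (1, 2, 3)]
ratios = [dists[i + 1] / dists[i] for i in range(2)]
print(ratios)   # each step shrinks the distance by about lam2
```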