Markov chain
A Markov chain is a stochastic process whose next state depends only on the current state, not on the sequence of states that preceded it (the Markov property). Its dynamics are described by a transition matrix whose entry (i, j) gives the probability of moving from state i to state j.
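A minimal sketch of such a process in Python, assuming NumPy; the two-state "weather" chain and its transition probabilities are made-up examples, not from the original text:

```python
import numpy as np

# Hypothetical two-state chain: state 0 = "sunny", state 1 = "rainy".
# P[i, j] is the probability of moving from state i to state j; rows sum to 1.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

rng = np.random.default_rng(0)

def simulate(P, start, steps):
    """Walk the chain: each next state depends only on the current one."""
    state = start
    path = [state]
    for _ in range(steps):
        state = rng.choice(len(P), p=P[state])
        path.append(state)
    return path

print(simulate(P, start=0, steps=10))
```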
A stationary distribution is a probability distribution over states that remains invariant under transition, i.e. a row vector \(\pi\) with \(\pi P = \pi\): a left eigenvector of the transition matrix \(P\) with eigenvalue 1 whose entries sum to 1.
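A sketch of computing this eigenvector numerically, again assuming NumPy and reusing the hypothetical transition matrix from above:

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# A stationary distribution pi satisfies pi @ P = pi and sum(pi) == 1,
# i.e. it is a left eigenvector of P for eigenvalue 1. We recover it from
# the eigendecomposition of P's transpose.
eigvals, eigvecs = np.linalg.eig(P.T)
idx = np.argmin(np.abs(eigvals - 1.0))  # eigenvalue closest to 1
pi = np.real(eigvecs[:, idx])
pi = pi / pi.sum()                      # normalize entries to sum to 1

print(pi)       # ~ [0.833, 0.167] for this example chain
print(pi @ P)   # equals pi: invariant under transition
```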