Probability theory MOC

Markov chain

A Markov chain is a random sequence of events where the probability of each event depends only on the event immediately preceding it. Such a process may be characterized by a state space $S$ and a transition matrix $P$, such that for each pair of states $i, j \in S$ we have $P_{ij} = \Pr(X_{n+1} = j \mid X_n = i)$.
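A minimal sketch of such a process, using an assumed two-state weather chain ("sunny"/"rainy") that is not part of this note:

```python
import random

# Assumed example chain: two states and a row-stochastic transition matrix.
STATES = ["sunny", "rainy"]
# P[i][j] = probability of moving from state i to state j; each row sums to 1.
P = [[0.9, 0.1],
     [0.5, 0.5]]

def simulate(start, steps, rng=random.Random(0)):
    """Walk the chain for `steps` transitions, returning the visited states."""
    i = STATES.index(start)
    path = [start]
    for _ in range(steps):
        # The next state depends only on the current state i.
        i = rng.choices(range(len(STATES)), weights=P[i])[0]
        path.append(STATES[i])
    return path

print(simulate("sunny", 5))
```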

A stationary distribution is a distribution over states which remains invariant under transition, i.e. a row vector $\pi$ with nonnegative entries summing to 1 such that $\pi P = \pi$ (a left eigenvector of $P$ with eigenvalue 1).
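One way to find a stationary distribution is power iteration: repeatedly apply the transition matrix to an initial distribution until it stops changing. A hedged sketch, reusing the assumed two-state matrix from above:

```python
# Assumed example transition matrix (row-stochastic).
P = [[0.9, 0.1],
     [0.5, 0.5]]

def stationary(P, iters=1000):
    """Approximate the stationary distribution of P by power iteration."""
    n = len(P)
    pi = [1.0 / n] * n  # start from the uniform distribution
    for _ in range(iters):
        # Row vector times matrix: pi'_j = sum_i pi_i * P[i][j]
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

pi = stationary(P)
print(pi)
```

For this matrix the fixed point can be checked by hand: $\pi_1 = 5\pi_2$ and $\pi_1 + \pi_2 = 1$ give $\pi = (5/6, 1/6)$.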


develop | en | sembr