Probability theory MOC

Markov chain

A Markov chain (𝒦, 𝑆) is a random sequence of events in which the probability of each event depends only on the event immediately preceding it. Such a process is characterized by a state space 𝑆 and a transition matrix 𝒦 : 𝑆 × 𝑆 → [0, 1], such that for each 𝑠 ∈ 𝑆 we have

βˆ‘_{𝑠′ ∈ 𝑆} 𝒦(𝑠, 𝑠′) = 1

A stationary distribution is a distribution over states which remains invariant under transition: a row vector π with nonnegative entries summing to 1 such that π𝒦 = π, i.e. a left eigenvector of 𝒦 with eigenvalue 1. (It must be a *left* eigenvector, since the rows of 𝒦 sum to 1 and distributions are propagated by right-multiplication against 𝒦.)
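A minimal numeric sketch of both definitions, using a hypothetical two-state chain (the states and probabilities are illustrative, not from the note): the rows of 𝒦 sum to 1, and repeatedly applying 𝒦 to any initial distribution converges to the stationary distribution.

```python
import numpy as np

# Hypothetical two-state chain: S = {sunny, rainy},
# K[s, s'] = probability of moving from state s to state s'.
K = np.array([
    [0.9, 0.1],  # sunny -> sunny, sunny -> rainy
    [0.5, 0.5],  # rainy -> sunny, rainy -> rainy
])

# Transition-matrix condition: each row sums to 1.
assert np.allclose(K.sum(axis=1), 1.0)

# Power iteration: propagate an initial distribution until it
# stops changing; the limit is a stationary distribution.
pi = np.array([1.0, 0.0])  # start certain the state is sunny
for _ in range(1000):
    pi = pi @ K

# pi is a left eigenvector of K with eigenvalue 1: pi @ K == pi.
assert np.allclose(pi @ K, pi)
print(pi)  # ~ [0.8333, 0.1667], i.e. (5/6, 1/6)
```

Solving π𝒦 = π by hand for this chain gives π = (5/6, 1/6), matching the iteration.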


develop | en | SemBr