Definition:Markov Chain/Homogeneous

Definition

Let $\sequence {X_n}_{n \mathop \ge 0}$ be a Markov chain on a countable set $S$.


$\sequence {X_n}_{n \mathop \ge 0}$ is homogeneous if and only if $\condprob {X_{n + 1} = j} {X_n = i}$ does not depend on the value of $n$ for all $i, j \in S$.
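As an illustrative sketch (not part of the definition above; the symbols $p_{i j}$, $\mathbf P$, $\alpha$ and $\beta$ are introduced here purely for the example), homogeneity allows the common value of the conditional probability to be written:

$p_{i j} := \condprob {X_{n + 1} = j} {X_n = i} \quad \text {for every } n \ge 0$

so that the chain is described by a single transition matrix $\mathbf P = \paren {p_{i j} }_{i, j \mathop \in S}$, independent of $n$. For instance, a two-state chain on $S = \set {1, 2}$ with:

$\mathbf P = \begin {pmatrix} 1 - \alpha & \alpha \\ \beta & 1 - \beta \end {pmatrix}, \quad \alpha, \beta \in \closedint 0 1$

is homogeneous, since, for example, $\condprob {X_{n + 1} = 2} {X_n = 1} = \alpha$ for every $n \ge 0$.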

