Definition:Markov Chain/Homogeneous
Definition
Let $\sequence {X_n}_{n \mathop \ge 0}$ be a Markov chain on a countable set $S$.
$\sequence {X_n}_{n \mathop \ge 0}$ is homogeneous if and only if $\condprob {X_{n + 1} = j} {X_n = i}$ does not depend on the value of $n$ for all $i, j \in S$.
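Informally, homogeneity means the one-step transition probabilities are the same at every time step, so they can be written without reference to $n$. A minimal illustration (the two-state state space and the parameters $\alpha, \beta$ are assumptions for the example, not part of the definition):

For a homogeneous chain one may define, for each $i, j \in S$:
$p_{i j} := \condprob {X_{n + 1} = j} {X_n = i}$
where the right hand side takes the same value for every $n \ge 0$.

For example, take $S = \set {0, 1}$ and $\alpha, \beta \in \closedint 0 1$, and suppose that for every $n \ge 0$:
$\condprob {X_{n + 1} = 1} {X_n = 0} = \alpha$
$\condprob {X_{n + 1} = 0} {X_n = 1} = \beta$
Then these conditional probabilities do not depend on $n$, so $\sequence {X_n}_{n \mathop \ge 0}$ is homogeneous.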
Sources
- 2014: Geoffrey Grimmett and Dominic Welsh: Probability: An Introduction (2nd ed.): $\S 12$: Markov chains