Definition:Markov Chain/State Space


Definition

Let $\sequence {X_n}_{n \mathop \ge 0}$ be a Markov chain on a countable set $S$.


The set $S$ is called the state space of the Markov chain.
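
For example, a simple random walk on the integers, in which $X_{n + 1} = X_n \pm 1$ each with probability $\dfrac 1 2$, is a Markov chain whose state space is $S = \Z$.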

