Definition:Markov Chain

Definition

Let $\sequence {X_n}_{n \mathop \ge 0}$ be a stochastic process over a countable set $S$.

Let $\condprob A B$ denote the conditional probability of the event $A$ given the event $B$.

Let $\sequence {X_n}_{n \mathop \ge 0}$ satisfy the Markov property:

$\condprob {X_{n + 1} = i_{n + 1} } {X_0 = i_0, X_1 = i_1, \ldots, X_n = i_n} = \condprob {X_{n + 1} = i_{n + 1} } {X_n = i_n}$

for all $n \ge 0$ and all $i_0, i_1, \ldots, i_{n + 1} \in S$.


That is, such that the conditional probability of $X_{n + 1}$ is dependent only upon $X_n$ and upon no earlier values of $\sequence {X_n}$.

That is, given its present state, the future evolution of $\sequence {X_n}$ is unaffected by its history.


Then $\sequence {X_n}_{n \mathop \ge 0}$ is a Markov chain.
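
As an informal illustration (not part of the definition above), the following Python sketch simulates a homogeneous Markov chain on the state space $S = \set {0, 1}$. The transition probabilities are hypothetical values chosen purely for demonstration; the point is that the function step samples $X_{n + 1}$ using the current state alone, which is precisely the Markov property.

import random

# transition[i][j] plays the role of Pr(X_{n+1} = j | X_n = i);
# the numbers are hypothetical, chosen only for demonstration,
# and each row sums to 1
transition = {
    0: {0: 0.9, 1: 0.1},
    1: {0: 0.5, 1: 0.5},
}

def step(state):
    # sample X_{n+1} given X_n = state: the distribution of the
    # next state depends only on the current state (Markov property)
    outcomes = list(transition[state])
    weights = [transition[state][j] for j in outcomes]
    return random.choices(outcomes, weights=weights)[0]

# simulate X_0, X_1, ..., X_20 starting from X_0 = 0
chain = [0]
for _ in range(20):
    chain.append(step(chain[-1]))
print(chain)

Because the same transition table is consulted at every step, this sketch also illustrates homogeneity in the sense defined below.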


State Space

The set $S$ is called the state space of the Markov chain.
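
For example, a simple random walk on the integers is a Markov chain whose state space is $S = \Z$.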


Homogeneity

$\sequence {X_n}_{n \mathop \ge 0}$ is homogeneous if and only if $\condprob {X_{n + 1} = j} {X_n = i}$ does not depend on the value of $n$ for all $i, j \in S$.
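
For example, when $\sequence {X_n}_{n \mathop \ge 0}$ is homogeneous, one may write $p_{i j} = \condprob {X_{n + 1} = j} {X_n = i}$ for the common value of this conditional probability over all $n$; when $S$ is finite, the numbers $p_{i j}$ are conventionally collected into the transition matrix of the chain.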


Also known as

A Markov chain is also known as a Markov process, although that term is frequently reserved for the more general setting in which the time parameter or the state space is allowed to be continuous.


Source of Name

This entry was named for Andrey Andreyevich Markov.

