Definition:Convergence in Probability

Definition

Let $\struct {\Omega, \Sigma, \Pr}$ be a probability space.

Let $\sequence {X_n}_{n \mathop \in \N}$ be a sequence of real-valued random variables on $\struct {\Omega, \Sigma, \Pr}$.

Let $X$ be a real-valued random variable on $\struct {\Omega, \Sigma, \Pr}$.


We say that $\sequence {X_n}_{n \mathop \in \N}$ converges in probability to $X$ if and only if:

$\ds \forall \epsilon \in \R_{>0}: \lim_{n \mathop \to \infty} \map \Pr {\size {X_n - X} < \epsilon} = 1$


This is written:

$X_n \xrightarrow p X$
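
An equivalent formulation, obtained by passing to the complementary event, is:

$\ds \forall \epsilon \in \R_{>0}: \lim_{n \mathop \to \infty} \map \Pr {\size {X_n - X} \ge \epsilon} = 0$


As a concrete example: let $X_n$ take the value $1$ with probability $\dfrac 1 n$ and the value $0$ otherwise, and let $X = 0$.

Then for every $\epsilon \in \R_{>0}$ with $\epsilon \le 1$:

$\ds \map \Pr {\size {X_n - X} < \epsilon} = \map \Pr {X_n = 0} = 1 - \dfrac 1 n \to 1$

while for $\epsilon > 1$ the probability is identically $1$, so $X_n \xrightarrow p 0$.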

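The following is a minimal numerical sketch of the example above, not part of the formal definition: it assumes Python with NumPy, and the choices of $\epsilon$, the sample size and the values of $n$ are illustrative only. It estimates $\map \Pr {\size {X_n - X} < \epsilon}$ by simulation, and the estimates approach $1$ as $n$ grows, in line with the definition.

import numpy as np

# Monte Carlo illustration of convergence in probability for the example above:
# X_n takes the value 1 with probability 1/n and 0 otherwise; X = 0.
rng = np.random.default_rng(seed=0)

epsilon = 0.5        # any epsilon in (0, 1] gives the same event {X_n = 0}
n_trials = 100_000   # number of simulated realisations per value of n

for n in (1, 10, 100, 1000):
    # Draw n_trials independent realisations of X_n.
    x_n = (rng.random(n_trials) < 1.0 / n).astype(float)
    # Empirical estimate of Pr(|X_n - X| < epsilon), with X = 0.
    estimate = np.mean(np.abs(x_n) < epsilon)
    print(f"n = {n:>4}: estimated Pr(|X_n - X| < epsilon) = {estimate:.4f}")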
