Definition:Convergence in Probability
Definition
Let $\struct {\Omega, \Sigma, \Pr}$ be a probability space.
Let $\sequence {X_n}_{n \mathop \in \N}$ be a sequence of real-valued random variables on $\struct {\Omega, \Sigma, \Pr}$.
Let $X$ be a real-valued random variable on $\struct {\Omega, \Sigma, \Pr}$.
We say that $\sequence {X_n}_{n \mathop \in \N}$ converges in probability to $X$ if and only if:
- $\ds \forall \epsilon \in \R_{>0}: \lim_{n \mathop \to \infty} \map \Pr {\size {X_n - X} < \epsilon} = 1$
This is written:
- $X_n \xrightarrow p X$
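As a brief illustration of the definition (an added example, not taken from the cited sources), let $X_n$ be uniformly distributed on $\closedint 0 {1/n}$ for each $n \in \N_{>0}$ and let $X = 0$. Then for every $\epsilon \in \R_{>0}$:
- $\ds \map \Pr {\size {X_n - X} < \epsilon} = \map \Pr {X_n < \epsilon} = \min \set {n \epsilon, 1} \to 1$ as $n \to \infty$
since $\min \set {n \epsilon, 1} = 1$ whenever $n \ge 1 / \epsilon$. Hence $X_n \xrightarrow p 0$. Note also that the defining condition is equivalent to $\ds \lim_{n \mathop \to \infty} \map \Pr {\size {X_n - X} \ge \epsilon} = 0$.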
Sources
- 2002: George Casella and Roger L. Berger: Statistical Inference (2nd ed.): $5.5$: Convergence Concepts: Definition $5.5.1$
- 2011: Morris H. DeGroot and Mark J. Schervish: Probability and Statistics (4th ed.): $6.2$: The Law of Large Numbers: Definition $6.2.1$