Definition:Random Variable/Discrete/Definition 1


Definition

Let $\struct {\Omega, \Sigma, \Pr}$ be a probability space.

Let $\struct {S, \Sigma'}$ be a measurable space.

A discrete random variable on $\struct {\Omega, \Sigma, \Pr}$ taking values in $\struct {S, \Sigma'}$ is a mapping $X: \Omega \to S$ such that:

$(1): \quad$ The image of $X$ is a countable subset of $S$
$(2): \quad$ $\forall x \in S: \set {\omega \in \Omega: \map X \omega = x} \in \Sigma$


Alternatively, the second condition can be written as:

$(2): \quad$ $\forall x \in S: X^{-1} \sqbrk {\set x} \in \Sigma$

where $X^{-1} \sqbrk {\set x}$ denotes the preimage of $\set x$.
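
As a minimal illustration (not part of the formal definition), consider the experiment of tossing a fair coin twice, with $\Omega = \set {\text {HH}, \text {HT}, \text {TH}, \text {TT} }$, $\Sigma$ the power set of $\Omega$, and $\Pr$ assigning probability $\dfrac 1 4$ to each singleton.

Define $X: \Omega \to \R$ to be the number of heads:

$\map X {\text {HH} } = 2, \quad \map X {\text {HT} } = \map X {\text {TH} } = 1, \quad \map X {\text {TT} } = 0$

The image of $X$ is $\set {0, 1, 2}$, which is countable, and for each $x \in \R$ the set $\set {\omega \in \Omega: \map X \omega = x}$ is one of $\O$, $\set {\text {TT} }$, $\set {\text {HT}, \text {TH} }$ or $\set {\text {HH} }$, each of which is an element of $\Sigma$.

Thus both conditions are satisfied, and $X$ is a discrete random variable.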


Discussion

The meaning of condition $(2)$ in this context can be explained as follows:

Let $\EE$ be an experiment with probability space $\struct {\Omega, \Sigma, \Pr}$.

Suppose $X$ is a discrete random variable modelling the outcome of $\EE$, taking values in $S$ (in the most common case, $S = \R$). We do not know in advance what the actual value of $X$ will be, since the outcome of $\EE$ involves chance.

What we can do, though, is determine the probability that $X$ takes any particular value $x$.

To do this, we note that $X$ has the value $x$ if and only if the outcome of $\EE$ lies in the subset of $\Omega$ which is mapped to $x$.

For any $x \in S$, that subset is the preimage $X^{-1} \sqbrk {\set x}$. But $\Pr$ is defined only on the events in $\Sigma$, so in order to assign a probability to this subset we need $X^{-1} \sqbrk {\set x} \in \Sigma$. This is exactly what condition $(2)$ guarantees: it ensures that $\map \Pr {X = x} = \map \Pr {X^{-1} \sqbrk {\set x} }$ is well defined.
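
Continuing the coin-tossing illustration above (again an illustration, not part of the source argument), for $x = 1$:

$\map \Pr {X = 1} = \map \Pr {X^{-1} \sqbrk {\set 1} } = \map \Pr {\set {\text {HT}, \text {TH} } } = \dfrac 1 2$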

