Expectation of Function of Discrete Random Variable
Theorem
Let $X$ be a discrete random variable.
Let $\expect X$ be the expectation of $X$.
Let $g: \R \to \R$ be a real function.
Then:
- $\ds \expect {g \sqbrk X} = \sum_{x \mathop \in \Omega_X} \map g x \, \map \Pr {X = x}$
whenever the sum is absolutely convergent.
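For example, suppose $X$ takes each of the values $-1$, $0$ and $1$ with probability $\dfrac 1 3$, and let $\map g x = x^2$. Then:
- $\ds \expect {g \sqbrk X} = \paren {-1}^2 \cdot \dfrac 1 3 + 0^2 \cdot \dfrac 1 3 + 1^2 \cdot \dfrac 1 3 = \dfrac 2 3$
which agrees with computing the expectation directly from the distribution of $Y = g \sqbrk X$, as $Y$ takes the value $1$ with probability $\dfrac 2 3$ and the value $0$ with probability $\dfrac 1 3$.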
Proof
Let $\Omega_X = \Img X = I$.
Let $Y = g \sqbrk X$.
Thus:
- $\Omega_Y = \Img Y = g \sqbrk I$
So:
- $\ds \expect Y = \sum_{y \mathop \in g \sqbrk I} y \, \map \Pr {Y = y}$
- $\ds \phantom {\expect Y} = \sum_{y \mathop \in g \sqbrk I} y \sum_{ {x \mathop \in I} \atop {\map g x \mathop = y} } \map \Pr {X = x}$ by Probability Mass Function of Function of Discrete Random Variable
- $\ds \phantom {\expect Y} = \sum_{x \mathop \in I} \map g x \, \map \Pr {X = x}$
The final equality follows by regrouping the double sum: the sets $\set {x \in I: \map g x = y}$ for $y \in g \sqbrk I$ partition $I$, so each $x \in I$ is counted exactly once, with coefficient $y = \map g x$.
From the definition of expectation, this holds only when the last sum is absolutely convergent, which also justifies the rearrangement of terms.
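To see the regrouping in a concrete case, take the illustration above: $I = \set {-1, 0, 1}$ with $\map g x = x^2$, so $g \sqbrk I = \set {0, 1}$. Then:
- $\ds \sum_{y \mathop \in \set {0, 1} } y \sum_{ {x \mathop \in I} \atop {\map g x \mathop = y} } \map \Pr {X = x} = 0 \cdot \map \Pr {X = 0} + 1 \cdot \paren {\map \Pr {X = -1} + \map \Pr {X = 1} } = \sum_{x \mathop \in I} \map g x \, \map \Pr {X = x}$
Each $x \in I$ contributes to exactly one inner sum, namely the one indexed by $y = \map g x$.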
$\blacksquare$
Sources
- 1986: Geoffrey Grimmett and Dominic Welsh: Probability: An Introduction: $\S 2.4$: Expectation: Theorem $2 \ \text{B}$