Sum of Expectations of Independent Trials/Proof 2


Theorem

Let $\EE_1, \EE_2, \ldots, \EE_n$ be a sequence of experiments whose outcomes are independent of each other.

Let $X_1, X_2, \ldots, X_n$ be discrete random variables on $\EE_1, \EE_2, \ldots, \EE_n$ respectively.


Let $\expect {X_j}$ denote the expectation of $X_j$ for $j \in \set {1, 2, \ldots, n}$.


Then we have, whenever both sides are defined:

$\ds \expect {\sum_{j \mathop = 1}^n X_j} = \sum_{j \mathop = 1}^n \expect {X_j}$


That is, the sum of the expectations equals the expectation of the sum.
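
As an illustrative example (not part of the statement itself): let $X_1$ and $X_2$ be the scores shown by two independent fair six-sided dice, so that $\expect {X_1} = \expect {X_2} = \dfrac 7 2$.

Summing over the $36$ equally likely outcomes gives:

$\ds \expect {X_1 + X_2} = \sum_{s \mathop = 2}^{12} s \map \Pr {X_1 + X_2 = s} = 7 = \expect {X_1} + \expect {X_2}$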


Proof

The proof proceeds by induction on the number of terms $n$ in the sum.

For all $n \in \Z_{> 0}$, let $\map P n$ be the proposition:

$\ds \expect {\sum_{j \mathop = 1}^n X_j} = \sum_{j \mathop = 1}^n \expect {X_j}$


Basis for the Induction

$\map P 1$ is the case:

$\ds \expect {\sum_{j \mathop = 1}^1 X_j} = \sum_{j \mathop = 1}^1 \expect {X_j}$

That is:

$\expect {X_1} = \expect {X_1}$

which is tautologically true.


This is our basis for the induction.


Induction Hypothesis

Now it needs to be shown that if $\map P k$ is true, where $k \ge 1$, then it logically follows that $\map P {k + 1}$ is true.


So this is the induction hypothesis:

$\ds \expect {\sum_{j \mathop = 1}^k X_j} = \sum_{j \mathop = 1}^k \expect {X_j}$


from which it is to be shown that:

$\ds \expect {\sum_{j \mathop = 1}^{k + 1} X_j} = \sum_{j \mathop = 1}^{k + 1} \expect {X_j}$


Induction Step

This is the induction step:


Let $Y$ denote the random variable $\ds \sum_{j \mathop = 1}^k X_j$.

Then we compute:

\(\ds \expect {\sum_{j \mathop = 1}^{k + 1} X_j}\) \(=\) \(\ds \expect {Y + X_{k + 1} }\)
\(\ds \) \(=\) \(\ds \expect Y + \expect {X_{k + 1} }\) Expectation is Linear
\(\ds \) \(=\) \(\ds \sum_{j \mathop = 1}^k \expect {X_j} + \expect {X_{k + 1} }\) Induction Hypothesis
\(\ds \) \(=\) \(\ds \sum_{j \mathop = 1}^{k + 1} \expect {X_j}\) Definition of Summation
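
For instance (an illustrative unrolling, not part of the formal argument), taking $n = 3$ the induction yields:

$\ds \expect {X_1 + X_2 + X_3} = \expect {\paren {X_1 + X_2} + X_3} = \expect {X_1 + X_2} + \expect {X_3} = \expect {X_1} + \expect {X_2} + \expect {X_3}$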

The result follows by the Principle of Mathematical Induction.

$\blacksquare$