Sum of Variances of Independent Trials


Theorem

Let $\EE_1, \EE_2, \ldots, \EE_n$ be a sequence of experiments whose outcomes are independent of each other.

Let $X_1, X_2, \ldots, X_n$ be discrete random variables on $\EE_1, \EE_2, \ldots, \EE_n$ respectively.


Let $\var {X_j}$ be the variance of $X_j$ for $j \in \set {1, 2, \ldots, n}$.


Then:

$\ds \var {\sum_{j \mathop = 1}^n X_j} = \sum_{j \mathop = 1}^n \var {X_j}$


That is, the sum of the variances equals the variance of the sum.
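For instance, take $n = 2$, where $X_1$ and $X_2$ are the scores on two independent rolls of a fair $6$-sided die.

Each roll has variance $\dfrac {91} 6 - \paren {\dfrac 7 2}^2 = \dfrac {35} {12}$, so:

$\ds \var {X_1 + X_2} = \dfrac {35} {12} + \dfrac {35} {12} = \dfrac {35} 6$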


Proof

\(\ds \var {\sum_{j \mathop = 1}^n X_j}\) \(=\) \(\ds \expect {\paren {\sum_{j \mathop = 1}^n X_j}^2} - \expect {\sum_{j \mathop = 1}^n X_j}^2\) Variance as Expectation of Square minus Square of Expectation/Discrete
\(\ds \) \(=\) \(\ds \expect {\sum_{0 \mathop < i \mathop < j \mathop \le n} 2 X_i X_j + \sum_{j \mathop = 1}^n X_j^2} - 2 \sum_{0 \mathop < i \mathop < j \mathop \le n} \expect {X_i} \expect {X_j} - \sum_{j \mathop = 1}^n \expect {X_j}^2\) expanding both squares, with Expectation is Linear: Discrete applied to the second term
\(\ds \) \(=\) \(\ds \sum_{0 \mathop < i \mathop < j \mathop \le n} 2 \expect {X_i X_j} + \sum_{j \mathop = 1}^n \expect {X_j^2} - 2 \sum_{0 \mathop < i \mathop < j \mathop \le n} \expect {X_i} \expect {X_j} - \sum_{j \mathop = 1}^n \expect {X_j}^2\) Expectation is Linear: Discrete
\(\ds \) \(=\) \(\ds 2 \sum_{0 \mathop < i \mathop < j \mathop \le n} \expect {X_i} \expect {X_j} + \sum_{j \mathop = 1}^n \expect {X_j^2} - 2 \sum_{0 \mathop < i \mathop < j \mathop \le n} \expect {X_i} \expect {X_j} - \sum_{j \mathop = 1}^n \expect {X_j}^2\) Condition for Independence from Product of Expectations/Corollary
\(\ds \) \(=\) \(\ds \sum_{j \mathop = 1}^n \expect {X_j^2} - \sum_{j \mathop = 1}^n \expect {X_j}^2\)
\(\ds \) \(=\) \(\ds \sum_{j \mathop = 1}^n \paren {\expect {X_j^2} - \expect {X_j}^2}\)
\(\ds \) \(=\) \(\ds \sum_{j \mathop = 1}^n \var {X_j}\) Variance as Expectation of Square minus Square of Expectation/Discrete

$\blacksquare$
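
To illustrate the theorem numerically, the following sketch (assuming NumPy is available; the three distributions are chosen arbitrarily for illustration) compares the sample variance of a sum of independent discrete random variables with the sum of their sample variances:

import numpy as np

rng = np.random.default_rng(0)
n_samples = 1_000_000

# Three independent discrete random variables with arbitrary illustrative distributions.
x1 = rng.integers(1, 7, size=n_samples)    # fair six-sided die: variance 35/12
x2 = rng.binomial(1, 0.3, size=n_samples)  # Bernoulli(0.3): variance 0.21
x3 = rng.binomial(5, 0.5, size=n_samples)  # Binomial(5, 0.5): variance 1.25

# Sample variance of the sum versus the sum of the sample variances:
# both should be close to 35/12 + 0.21 + 1.25.
print(np.var(x1 + x2 + x3))
print(np.var(x1) + np.var(x2) + np.var(x3))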