Variance of Bernoulli Distribution
Theorem
Let $X$ be a discrete random variable with the Bernoulli distribution with parameter $p$:
- $X \sim \Bernoulli p$
Then the variance of $X$ is given by:
- $\var X = p \paren {1 - p}$
Proof 1
From the definition of variance:
- $\var X = \expect {\paren {X - \expect X}^2}$
From the Expectation of Bernoulli Distribution, we have $\expect X = p$.
Then by definition of Bernoulli distribution:
\(\ds \expect {\paren {X - \expect X}^2}\) | \(=\) | \(\ds \paren {1 - p}^2 \times p + \paren {0 - p}^2 \times \paren {1 - p}\)
\(\ds \) | \(=\) | \(\ds p - 2 p^2 + p^3 + p^2 - p^3\)
\(\ds \) | \(=\) | \(\ds p - p^2\)
\(\ds \) | \(=\) | \(\ds p \paren {1 - p}\)
$\blacksquare$
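As a quick numerical sanity check (not part of the proof), the definition-based computation above can be evaluated directly: the function name and parameter values below are illustrative only.

```python
# Check of Proof 1: compute Var(X) directly from the definition
# E[(X - E[X])^2] for a Bernoulli(p) variable, which takes the value
# 1 with probability p and 0 with probability 1 - p.

def bernoulli_variance(p):
    mean = 1 * p + 0 * (1 - p)  # E[X] = p
    # Sum (x - E[X])^2 * Pr(X = x) over the two outcomes x = 1 and x = 0
    return (1 - mean) ** 2 * p + (0 - mean) ** 2 * (1 - p)

# Agrees with the closed form p(1 - p) for a spread of parameter values
for p in (0.0, 0.25, 0.5, 0.9, 1.0):
    assert abs(bernoulli_variance(p) - p * (1 - p)) < 1e-12
```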
Proof 2
From Variance as Expectation of Square minus Square of Expectation:
- $\var X = \expect {X^2} - \paren {\expect X}^2$
From Expectation of Function of Discrete Random Variable:
- $\ds \expect {X^2} = \sum_{x \mathop \in \Img X} x^2 \, \map \Pr {X = x}$
So:
\(\ds \expect {X^2}\) | \(=\) | \(\ds 1^2 \times p + 0^2 \times \paren {1 - p}\)
\(\ds \) | \(=\) | \(\ds p\)
Then:
\(\ds \var X\) | \(=\) | \(\ds \expect {X^2} - \paren {\expect X}^2\)
\(\ds \) | \(=\) | \(\ds p - p^2\) | Expectation of Bernoulli Distribution
\(\ds \) | \(=\) | \(\ds p \paren {1 - p}\)
$\blacksquare$
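The moment identity used in Proof 2 can also be checked numerically; this sketch is illustrative, with arbitrary parameter values.

```python
# Check of Proof 2: Var(X) = E[X^2] - (E[X])^2 for Bernoulli(p)

def variance_from_moments(p):
    e_x2 = 1 ** 2 * p + 0 ** 2 * (1 - p)  # E[X^2] = p
    e_x = 1 * p + 0 * (1 - p)             # E[X] = p
    return e_x2 - e_x ** 2

for p in (0.1, 0.3, 0.5, 0.8):
    assert abs(variance_from_moments(p) - p * (1 - p)) < 1e-12
```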
Proof 3
The Bernoulli distribution is the binomial distribution with $n = 1$.
Hence from Variance of Binomial Distribution:
- $\var X = n p \paren {1 - p} = 1 \times p \paren {1 - p} = p \paren {1 - p}$
$\blacksquare$
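The specialization in Proof 3 can be illustrated by computing the binomial variance from its probability mass function and setting $n = 1$; the function name is illustrative, and only the Python standard library is assumed.

```python
import math

# Variance of Binomial(n, p) computed directly from its PMF,
# Pr(X = k) = C(n, k) p^k (1 - p)^(n - k), via Var = E[X^2] - (E[X])^2
def binomial_variance(n, p):
    pmf = [math.comb(n, k) * p ** k * (1 - p) ** (n - k) for k in range(n + 1)]
    mean = sum(k * pmf[k] for k in range(n + 1))
    e_k2 = sum(k ** 2 * pmf[k] for k in range(n + 1))
    return e_k2 - mean ** 2

# n = 1 recovers the Bernoulli variance p(1 - p)
for p in (0.2, 0.35, 0.5, 0.75):
    assert abs(binomial_variance(1, p) - p * (1 - p)) < 1e-12
```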
Proof 4
From Variance of Discrete Random Variable from PGF, we have:
- $\var X = \map { {\Pi_X}''} 1 + \mu - \mu^2$
where $\mu = \expect X$ is the expectation of $X$.
From the Probability Generating Function of Bernoulli Distribution, we have:
- $\map {\Pi_X} s = q + p s$
where $q = 1 - p$.
From Expectation of Bernoulli Distribution, we have $\mu = p$.
From Derivatives of PGF of Bernoulli Distribution, we have:
- $\map { {\Pi_X}''} s = 0$
Hence:
- $\var X = 0 + \mu - \mu^2 = p - p^2 = p \paren {1 - p}$
$\blacksquare$
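The PGF identity $\var X = \map { {\Pi_X}''} 1 + \mu - \mu^2$ can be checked numerically by approximating the second derivative of $\map {\Pi_X} s = q + p s$ at $s = 1$ with a central finite difference; this is an illustrative sketch, with the step size and parameter value chosen arbitrarily.

```python
# PGF of Bernoulli(p): Pi_X(s) = q + p*s with q = 1 - p
def pgf(s, p):
    return (1 - p) + p * s

# Central finite-difference approximation to the second derivative
def second_derivative(f, x, h=1e-4):
    return (f(x + h) - 2 * f(x) + f(x - h)) / h ** 2

p = 0.4
mu = p  # E[X] = p

# Pi_X is linear in s, so its second derivative is 0;
# Var(X) = Pi_X''(1) + mu - mu^2 = p - p^2
var = second_derivative(lambda s: pgf(s, p), 1.0) + mu - mu ** 2
assert abs(var - p * (1 - p)) < 1e-6
```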
Proof 5
From Moment Generating Function of Bernoulli Distribution, the moment generating function $M_X$ of $X$ is given by:
- $\map {M_X} t = q + p e^t$
From Variance as Expectation of Square minus Square of Expectation, we have:
- $\var X = \expect {X^2} - \paren {\expect X}^2$
From Moment in terms of Moment Generating Function:
- $\expect {X^2} = \map {M_X''} 0$
We have:
\(\ds \map {M_X''} t\) | \(=\) | \(\ds \frac {\d^2} {\d t^2} \paren {q + p e^t}\)
\(\ds \) | \(=\) | \(\ds p \frac \d {\d t} \paren {e^t}\) | Derivative of Constant, Derivative of Exponential Function
\(\ds \) | \(=\) | \(\ds p e^t\) | Derivative of Exponential Function
Setting $t = 0$ gives:
\(\ds \expect {X^2}\) | \(=\) | \(\ds p e^0\)
\(\ds \) | \(=\) | \(\ds p\) | Exponential of Zero
In Expectation of Bernoulli Distribution, it is shown that:
- $\expect X = p$
So:
\(\ds \var X\) | \(=\) | \(\ds p - p^2\)
\(\ds \) | \(=\) | \(\ds p \paren {1 - p}\)
$\blacksquare$
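Analogously to the PGF check, the MGF route of Proof 5 can be verified numerically: $\expect {X^2} = \map {M_X''} 0$ is approximated by a finite difference of $\map {M_X} t = q + p e^t$. This is an illustrative sketch with arbitrary parameter values, using only the Python standard library.

```python
import math

# MGF of Bernoulli(p): M_X(t) = q + p*e^t with q = 1 - p
def mgf(t, p):
    return (1 - p) + p * math.exp(t)

# Central finite-difference approximation to the second derivative
def second_derivative(f, x, h=1e-4):
    return (f(x + h) - 2 * f(x) + f(x - h)) / h ** 2

p = 0.7

# E[X^2] = M_X''(0) = p * e^0 = p
e_x2 = second_derivative(lambda t: mgf(t, p), 0.0)

# Var(X) = E[X^2] - (E[X])^2 = p - p^2
var = e_x2 - p ** 2
assert abs(var - p * (1 - p)) < 1e-6
```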