Variance of Poisson Distribution


Theorem

Let $X$ be a discrete random variable with the Poisson distribution with parameter $\lambda$.


Then the variance of $X$ is given by:

$\operatorname {var}\, \left({X}\right) = \lambda$
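
As a quick numerical sanity check of the theorem (not part of the formal proof), the variance of simulated Poisson variates can be compared against $\lambda$. The following is a minimal Python sketch; the use of NumPy, the seed, the sample size and the choice $\lambda = 4$ are illustrative assumptions only:

    import numpy as np

    rng = np.random.default_rng(0)
    lam = 4.0                                   # example rate parameter (assumed value)
    samples = rng.poisson(lam, size=1_000_000)  # simulated Poisson(lambda) variates

    print(samples.var())    # close to 4.0, in agreement with var(X) = lambda
    print(samples.mean())   # also close to 4.0, matching E(X) = lambda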


Proof 1

From the definition of Variance as Expectation of Square minus Square of Expectation:

$\operatorname {var}\, \left({X}\right) = E \left({X^2}\right) - \left({E \left({X}\right)}\right)^2$

From Expectation of Function of Discrete Random Variable:

$\displaystyle E \left({X^2}\right) = \sum_{x \mathop \in \Omega_X} x^2 \Pr \left({X = x}\right)$


So:

\(\displaystyle E \left({X^2}\right)\) \(=\) \(\displaystyle \sum_{k \mathop \ge 0} k^2 \dfrac 1 {k!} \lambda^k e^{-\lambda}\)          Definition of Poisson Distribution
\(=\) \(\displaystyle \lambda e^{-\lambda} \sum_{k \mathop \ge 1} k \dfrac 1 {\left({k - 1}\right)!} \lambda^{k - 1}\)          note change of limit: the term is zero when $k = 0$
\(=\) \(\displaystyle \lambda e^{-\lambda} \left({\sum_{k \mathop \ge 1} \left({k - 1}\right) \dfrac 1 {\left({k - 1}\right)!} \lambda^{k - 1} + \sum_{k \mathop \ge 1} \dfrac 1 {\left({k - 1}\right)!} \lambda^{k - 1} }\right)\)          straightforward algebra: writing $k = \left({k - 1}\right) + 1$
\(=\) \(\displaystyle \lambda e^{-\lambda} \left({\lambda \sum_{k \mathop \ge 2} \dfrac 1 {\left({k - 2}\right)!} \lambda^{k - 2} + \sum_{k \mathop \ge 1} \dfrac 1 {\left({k - 1}\right)!} \lambda^{k - 1} }\right)\)          again, note change of limit: the term is zero when $k - 1 = 0$
\(=\) \(\displaystyle \lambda e^{-\lambda} \left({\lambda \sum_{i \mathop \ge 0} \dfrac 1 {i!} \lambda^i + \sum_{j \mathop \ge 0} \dfrac 1 {j!} \lambda^j }\right)\)          putting $i = k - 2, j = k - 1$
\(=\) \(\displaystyle \lambda e^{-\lambda} \left({\lambda e^\lambda + e^\lambda}\right)\)          Taylor Series Expansion for Exponential Function
\(=\) \(\displaystyle \lambda \left({\lambda + 1}\right)\)
\(=\) \(\displaystyle \lambda^2 + \lambda\)


Then:

\(\displaystyle \operatorname {var} \left({X}\right)\) \(=\) \(\displaystyle E \left({X^2}\right) - \left({E \left({X}\right)}\right)^2\)
\(=\) \(\displaystyle \lambda^2 + \lambda - \lambda^2\)          Expectation of Poisson Distribution: $E \left({X}\right) = \lambda$
\(=\) \(\displaystyle \lambda\)

$\blacksquare$
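
The key intermediate result $E \left({X^2}\right) = \lambda^2 + \lambda$ can also be checked numerically by truncating the defining series. A minimal Python sketch, where the value $\lambda = 2.5$ and the truncation point $60$ (beyond which the terms are negligible) are arbitrary assumptions:

    import math

    lam = 2.5    # example rate parameter (assumed value)
    N = 60       # truncation point: terms beyond this are negligibly small

    # E(X^2) = sum_{k >= 0} k^2 * (1 / k!) * lambda^k * e^{-lambda}
    e_x2 = sum(k**2 * lam**k * math.exp(-lam) / math.factorial(k) for k in range(N))

    print(e_x2, lam**2 + lam)   # both approximately 8.75
    print(e_x2 - lam**2)        # E(X^2) - (E(X))^2, approximately 2.5 = lambda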


Proof 2

From Variance of Discrete Random Variable from PGF, we have:

$\operatorname {var} \left({X}\right) = \Pi''_X \left({1}\right) + \mu - \mu^2$

where $\mu = E \left({X}\right)$ is the expectation of $X$.


From the Probability Generating Function of Poisson Distribution, we have:

$\Pi_X \left({s}\right) = e^{-\lambda \left({1-s}\right)}$


From Expectation of Poisson Distribution, we have:

$\mu = \lambda$


From Derivatives of PGF of Poisson Distribution, we have:

$\Pi''_X \left({s}\right) = \lambda^2 e^{- \lambda \left({1-s}\right)}$


Putting $s = 1$ into the formula $\operatorname {var} \left({X}\right) = \Pi''_X \left({1}\right) + \mu - \mu^2$:

$\operatorname {var} \left({X}\right) = \lambda^2 e^{- \lambda \left({1 - 1}\right)} + \lambda - \lambda^2 = \lambda^2 + \lambda - \lambda^2 = \lambda$

and hence the result.

$\blacksquare$
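
The algebra of this proof can also be reproduced symbolically. The following is a minimal sketch using SymPy (the library choice is an assumption; any computer algebra system would serve), differentiating the PGF twice and substituting $s = 1$:

    import sympy as sp

    s, lam = sp.symbols('s lambda', positive=True)

    pgf = sp.exp(-lam * (1 - s))    # Pi_X(s), the PGF of the Poisson distribution
    second = sp.diff(pgf, s, 2)     # Pi''_X(s) = lambda^2 * exp(-lambda * (1 - s))

    mu = lam                        # E(X) = lambda
    var = sp.simplify(second.subs(s, 1) + mu - mu**2)

    print(var)                      # lambda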


Comment

The interesting thing about the Poisson distribution is that its expectation and its variance are both equal to its parameter $\lambda$.

