Uncertainty Function satisfies Axioms of Uncertainty
Theorem
Let $X$ be a random variable.
Let $X$ take a finite number of values with probabilities $p_1, p_2, \dotsc, p_n$.
Let $\map H X$ be the uncertainty function of $X$:
- $\map H X = \ds -\sum_k p_k \lg p_k$
where:
- $\lg$ denotes logarithm base $2$
- the summation is over those $k$ where $p_k > 0$.
Then the uncertainty function satisfies the Axioms of Uncertainty.
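For illustration (an example not given in the source), let $X$ take $3$ values with probabilities $\tfrac 1 2, \tfrac 1 4, \tfrac 1 4$. Then:
- $\map H X = -\paren {\dfrac 1 2 \lg \dfrac 1 2 + \dfrac 1 4 \lg \dfrac 1 4 + \dfrac 1 4 \lg \dfrac 1 4} = \dfrac 1 2 + \dfrac 1 2 + \dfrac 1 2 = \dfrac 3 2$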
Proof
This theorem requires a proof. You can help $\mathsf{Pr} \infty \mathsf{fWiki}$ by crafting such a proof. To discuss this page in more detail, feel free to use the talk page.
Sources
- 1988: Dominic Welsh: Codes and Cryptography ... (previous) ... (next): $\S 1$: Entropy = uncertainty = information: $1.1$ Uncertainty: Exercises $1.1$: $2.$