Definition:Uncertainty


Definition

Let $X$ be a discrete random variable.

Let $X$ take a finite number of values with probabilities $p_1, p_2, \dotsc, p_n$.


The uncertainty of $X$ is defined as:

$\map H X = \ds -\sum_k p_k \lg p_k$

where:

$\lg$ denotes the logarithm to base $2$
the summation is over those $k$ where $p_k > 0$.
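As an illustrative sketch (not part of the formal definition), the defining sum can be evaluated directly; the function name uncertainty and the example distribution below are assumptions chosen purely for illustration:

from math import log2

def uncertainty(probabilities):
    # Uncertainty (entropy) of a finite discrete distribution, in bits.
    # Terms with p_k = 0 are omitted, matching the condition that the
    # summation is over those k where p_k > 0.
    return -sum(p * log2(p) for p in probabilities if p > 0)

# A fair coin has uncertainty 1 bit:
print(uncertainty([1/2, 1/2]))  # 1.0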


Units

The unit of measurement used to quantify uncertainty is the bit.


Also defined as

In their definition of uncertainty, some sources do not specify the condition that the summation is over those $k$ where $p_k > 0$, but instead define $0 \lg 0 = 0$.

This is justified as:

$\ds \lim_{x \mathop \to 0^+} x \lg x = 0$

from Limit of Power of $x$ by Absolute Value of Power of Logarithm of $x$: Corollary.
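For instance, $0 \cdotp 01 \lg 0 \cdotp 01 \approx -0 \cdotp 066$ and $0 \cdotp 001 \lg 0 \cdotp 001 \approx -0 \cdotp 010$, so the contribution of an outcome whose probability tends to $0$ itself tends to $0$.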


Sources of Uncertainty

There are two main sources of uncertainty:


Uncertainty caused by Randomness

Uncertainty caused by randomness arises from physical conditions that cannot be controlled.


Uncertainty caused by Incomplete Knowledge

Uncertainty caused by incomplete knowledge arises from physical conditions that are not fully understood.


Also known as

The uncertainty of a random variable is also known as its entropy.


Examples

Example 1

Let $R_1$ and $R_2$ be horseraces.


Let $R_1$ have $7$ runners:

$3$ of which each have probability $\dfrac 1 6$ of winning
$4$ of which each have probability $\dfrac 1 8$ of winning.


Let $R_2$ have $8$ runners:

$2$ of which each have probability $\dfrac 1 4$ of winning
$6$ of which each have probability $\dfrac 1 {12}$ of winning.


Then $R_1$ and $R_2$ have equal uncertainty.
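A sketch of the computation, using the definition above:

$\map H {R_1} = 3 \times \dfrac 1 6 \lg 6 + 4 \times \dfrac 1 8 \lg 8 = \dfrac 1 2 \paren {1 + \lg 3} + \dfrac 3 2 = 2 + \dfrac 1 2 \lg 3$

$\map H {R_2} = 2 \times \dfrac 1 4 \lg 4 + 6 \times \dfrac 1 {12} \lg 12 = 1 + \dfrac 1 2 \paren {2 + \lg 3} = 2 + \dfrac 1 2 \lg 3$

so both have uncertainty $2 + \dfrac 1 2 \lg 3 \approx 2 \cdotp 79$ bits.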


Also see

  • Results about uncertainty can be found here.

