# Definition:Entropy of Finite Partition


## Definition

Let $\struct {\Omega, \Sigma, \Pr}$ be a probability space.

Let $\xi$ be a finite partition of $\Omega$.

The **entropy** of $\xi$ is defined as:

- $\ds \map H \xi := \sum_{A \mathop \in \xi} \map \phi {\map \Pr A}$

where $\phi : \closedint 0 1 \to \R_{\ge 0}$ is defined by:

- $\map \phi x := \begin {cases} 0 & : x = 0 \\ -x \map \ln x & : x \in \hointl 0 1 \end {cases}$

Here $\ln$ denotes the natural logarithm.
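As a minimal sketch of the definition above, the following hypothetical function computes $\map H \xi$ from the probabilities $\map \Pr A$ of the cells $A \in \xi$, applying $\map \phi x$ termwise (the function name and interface are illustrative, not part of the definition):

```python
import math

def partition_entropy(cell_probs):
    """Entropy H(xi) of a finite partition xi, given Pr[A] for each cell A.

    Applies phi(x) = 0 if x = 0, else -x ln x, and sums over the cells.
    """
    # The cells of a partition are disjoint and cover Omega, so their
    # probabilities must sum to 1.
    assert abs(sum(cell_probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(0.0 if p == 0.0 else -p * math.log(p) for p in cell_probs)

# A partition of Omega into 4 equally likely cells has entropy ln 4:
print(partition_entropy([0.25, 0.25, 0.25, 0.25]))  # 1.3862943611198906 = ln 4
```

Note that cells of probability $0$ contribute nothing to the sum, matching the case $\map \phi 0 = 0$.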

**Note:** Definition:Entropy (Probability Theory) and Definition:Uncertainty define what appears to be the same concept for a discrete random variable (a finite partition being the partition of $\Omega$ induced by such a variable), but using logarithm base $2$ rather than base $e$. The two conventions differ only by the constant factor $\ln 2$; it remains to be reconciled on $\mathsf{Pr} \infty \mathsf{fWiki}$ which convention is to be used where.
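To illustrate the base-change remark, the following hypothetical helpers (names are illustrative) compute entropy in base $e$ ("nats") and base $2$ ("bits"); by the change-of-base rule $\log_2 x = \ln x / \ln 2$, the two always differ by the constant factor $\ln 2$:

```python
import math

def entropy_nats(probs):
    """Entropy with natural logarithm, as on this page."""
    return sum(-p * math.log(p) for p in probs if p > 0)

def entropy_bits(probs):
    """Entropy with logarithm base 2, as in Definition:Entropy (Probability Theory)."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

probs = [0.5, 0.25, 0.25]
# The ratio is ln 2, independent of the partition chosen:
print(entropy_nats(probs) / entropy_bits(probs))  # 0.6931471805599453 = ln 2
```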

## Also see

- Definition:Entropy (Probability Theory)
- Definition:Uncertainty

## Sources

- 2013: Peter Walters: *An Introduction to Ergodic Theory* (4th ed.): $4.2$: Entropy of a Partition