Definition:Differential Entropy
Definition
Differential entropy extends the concept of entropy to continuous random variables.
Let $X$ be a continuous random variable.
Let $X$ have probability density function $f_X$.
Then the differential entropy of $X$, denoted $\map h X$ and measured in nats, is given by:
- $\ds \map h X = -\int_{-\infty}^\infty \map {f_X} x \ln \map {f_X} x \rd x$
Where $\map {f_X} x = 0$, we take $\map {f_X} x \ln \map {f_X} x = 0$ by convention.
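For example, let $X$ be continuous uniform on $[0, a]$ for some $a > 0$, so that $\map {f_X} x = \dfrac 1 a$ for $x \in [0, a]$ and $\map {f_X} x = 0$ otherwise. Then:
- $\ds \map h X = -\int_0^a \frac 1 a \ln \frac 1 a \rd x = \ln a$
In particular, for $a < 1$ we have $\map h X = \ln a < 0$, showing that, unlike the entropy of a discrete random variable, differential entropy can be negative.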
Also see
- Results about differential entropy can be found here.