Shannon differential entropy

The differential entropy of a continuous random variable ''X'' is defined by

''H(X) = -int_{RR^d} f(x) log(f(x)) dx,''

where ''f : RR^d -> RR'' is the probability density function of ''X''.
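
Since the integral above is the expectation of ''-log(f(X))'', it can be approximated by averaging ''-log(f(x_i))'' over samples drawn from ''f''. The following is a minimal C++ sketch of this idea (not part of TIM, which uses nonparametric estimators instead), using a one-dimensional Gaussian whose differential entropy has the closed form ''(1/2) log(2 pi e sigma^2)'':

    // Minimal sketch (not TIM code): approximate H(X) = E[-log f(X)]
    // by Monte Carlo, for a Gaussian N(0, sigma^2) with known density.
    #include <cmath>
    #include <cstdio>
    #include <random>

    int main()
    {
        const double pi = 3.14159265358979323846;
        const double sigma = 2.0;
        const int n = 100000;

        std::mt19937 rng(42);
        std::normal_distribution<double> gaussian(0.0, sigma);

        double sum = 0;
        for (int i = 0; i < n; ++i)
        {
            double x = gaussian(rng);
            // Density of N(0, sigma^2) at x.
            double f = std::exp(-x * x / (2 * sigma * sigma)) /
                (sigma * std::sqrt(2 * pi));
            sum += -std::log(f);
        }

        double monteCarlo = sum / n;
        // Closed form: (1/2) log(2 pi e sigma^2).
        double closedForm = 0.5 * (std::log(2 * pi * sigma * sigma) + 1);

        std::printf("Monte Carlo: %f, closed form: %f\n",
            monteCarlo, closedForm);
        return 0;
    }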

Entropy vs differential entropy

There is no generalization of the Shannon entropy to continuous random variables, and differential entropy is not one: while its definition looks similar to the definition of discrete entropy, it does not share its properties. For example, differential entropy can be negative, and it changes under an invertible transformation of ''X''. The role of differential entropy is two-fold. First, it is a syntactic device for describing other information-theoretic concepts, such as mutual information, which are defined as combinations of differential entropies. Second, when transforming data to minimize mutual information, it is equivalent to minimize differential entropies instead, which can be somewhat more efficient. This makes the estimation of differential entropy useful by itself.
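
As a minimal sketch of the first role (not part of TIM): for a bivariate Gaussian with unit variances and correlation ''rho'', the mutual information ''I(X;Y) = H(X) + H(Y) - H(X, Y)'' can be assembled from closed-form differential entropies, and equals ''-(1/2) log(1 - rho^2)''.

    // Minimal sketch (not TIM code): mutual information of a bivariate
    // Gaussian written as a combination of differential entropies.
    #include <cmath>
    #include <cstdio>

    int main()
    {
        const double pi = 3.14159265358979323846;
        const double rho = 0.8;

        // Differential entropy of a d-dimensional Gaussian with covariance
        // determinant det is (1/2) log((2 pi e)^d det).
        double hx = 0.5 * std::log(2 * pi) + 0.5;           // H(X), unit variance
        double hy = hx;                                      // H(Y)
        double hxy = std::log(2 * pi) + 1
            + 0.5 * std::log(1 - rho * rho);                 // H(X, Y)

        double fromEntropies = hx + hy - hxy;
        double direct = -0.5 * std::log(1 - rho * rho);

        std::printf("I(X;Y) from entropies: %f, directly: %f\n",
            fromEntropies, direct);
        return 0;
    }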

Differential entropy on differentiable manifolds

The definition of differential entropy can be generalized by assuming that ''X'' is distributed on an ''m''-dimensional differentiable manifold ''M'' in ''RR^d'', ''0 <= m <= d'', with a probability density function ''f : M -> RR'' taken with respect to the volume measure on ''M''. The differential entropy is then defined by:

''H(X) = -int_{M} f(x) log(f(x)) dx''

TIM implements estimators for both of these definitions.
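
As an illustration of the manifold case (a sketch, not TIM code): points drawn uniformly from a circle of radius ''r'' in ''RR^2'' lie on a 1-dimensional manifold, and their differential entropy with respect to arc length is ''log(2 pi r)''. A Kozachenko-Leonenko-style nearest-neighbor estimate using the intrinsic dimension ''m = 1'' recovers this value, while the Euclidean definition does not apply, since the distribution has no density with respect to the Lebesgue measure of ''RR^2''.

    // Minimal sketch (not TIM code) of a 1-nearest-neighbor entropy
    // estimate for data on a 1-dimensional manifold embedded in R^2.
    #include <algorithm>
    #include <cmath>
    #include <cstdio>
    #include <limits>
    #include <random>
    #include <vector>

    int main()
    {
        const double pi = 3.14159265358979323846;
        const double eulerGamma = 0.57721566490153286;
        const double r = 1.5;
        const int n = 4000;

        std::mt19937 rng(42);
        std::uniform_real_distribution<double> angle(0.0, 2 * pi);

        // Sample points uniformly from a circle of radius r in R^2.
        std::vector<double> x(n), y(n);
        for (int i = 0; i < n; ++i)
        {
            double t = angle(rng);
            x[i] = r * std::cos(t);
            y[i] = r * std::sin(t);
        }

        // Distance to the nearest neighbor of each point (brute force).
        double logDistanceSum = 0;
        for (int i = 0; i < n; ++i)
        {
            double nearest = std::numeric_limits<double>::infinity();
            for (int j = 0; j < n; ++j)
            {
                if (j == i) continue;
                double dx = x[i] - x[j];
                double dy = y[i] - y[j];
                nearest = std::min(nearest, std::sqrt(dx * dx + dy * dy));
            }
            logDistanceSum += std::log(nearest);
        }

        // Kozachenko-Leonenko-style estimate with k = 1 and intrinsic
        // dimension m = 1 (the volume of the unit 1-ball is 2):
        //   H ~ psi(n) - psi(1) + log(2) + (1/n) sum_i log(distance_i),
        // where psi(1) = -eulerGamma and psi(n) is approximated by log(n).
        double estimate = std::log(double(n)) + eulerGamma + std::log(2.0)
            + logDistanceSum / n;

        std::printf("estimate: %f, log(2 pi r): %f\n",
            estimate, std::log(2 * pi * r));
        return 0;
    }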

Learn more

Kozachenko-Leonenko estimator

Nilsson-Kleijn estimator

Stowell-Plumbley estimator

Files

An aggregate file for differential entropy.

differential_entropy.h

Analytical solutions for differential entropies.

differential_entropy_analytic.cpp

differential_entropy_analytic.h