Rényi entropy


The Rényi entropy of order ''q'' of a continuous random variable ''X'' is defined by

''H_q(X) = (1 / (1 - q)) log(int_{RR^n} f^q(x) dx),''

where ''f : RR^n -> RR'' is the probability density function of ''X'', and ''q >= 0'' with ''q != 1''. As ''q'' approaches 1, the Rényi entropy converges to the Shannon differential entropy; taking this limit as the value at ''q = 1'' makes the Rényi entropy continuous in ''q''.

Learn more

Analytic solutions for Rényi entropies

Leonenko-Pronzato-Savani estimator

Files

An aggregate header file for Rényi entropy.

renyi_entropy.h