Entropy combinations

An entropy combination of an ''RR^d''-valued random variable ''X'' is defined by

''C(X) = sum_{k = 1}^m c_k H(X_{L_k})''

where

* ''H'' denotes either the discrete entropy or the differential entropy,
* each ''L_k'' is a subset of ''[1, d]'', and ''X_{L_k}'' is the corresponding marginal of ''X'',
* ''c_k in RR'', and
* ''sum_{k = 1}^m c_k [i in L_k] = 0'' for every ''i in [1, d]''.

It can be shown that the last condition is necessary and sufficient for a discrete entropy combination to converge to the corresponding continuous entropy combination under a shrinking tiling of ''RR^d''.
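As a concrete illustration (not part of the library itself), mutual information ''I(X; Y) = H(X) + H(Y) - H(X, Y)'' is an entropy combination with subsets ''{1}'', ''{2}'', ''{1, 2}'' and coefficients ''1, 1, -1''. The following sketch, using a small discrete joint distribution chosen here purely for demonstration, checks the cancellation condition and evaluates the combination:

```python
import numpy as np

# Mutual information written as an entropy combination:
# subsets of component indices (0-based) and their coefficients.
subsets = [{0}, {1}, {0, 1}]
coeffs = [1, 1, -1]

# Cancellation condition: for every component index i,
# sum_k c_k * [i in L_k] must vanish.
for i in range(2):
    assert sum(c * (i in L) for c, L in zip(coeffs, subsets)) == 0

def entropy(p):
    """Discrete (Shannon) entropy in nats; zero probabilities are skipped."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# An example joint distribution of (X, Y) on {0, 1} x {0, 1}.
joint = np.array([[0.4, 0.1],
                  [0.1, 0.4]])

marginal_entropies = {
    frozenset({0}): entropy(joint.sum(axis=1)),  # H(X)
    frozenset({1}): entropy(joint.sum(axis=0)),  # H(Y)
    frozenset({0, 1}): entropy(joint.ravel()),   # H(X, Y)
}

# C = H(X) + H(Y) - H(X, Y) = I(X; Y) >= 0.
C = sum(c * marginal_entropies[frozenset(L)] for c, L in zip(coeffs, subsets))
print(C)
```

Transfer entropy ''T = H(W, X) + H(X, Y) - H(X) - H(W, X, Y)'' satisfies the same cancellation condition: each component index appears once with coefficient ''+1'' and once with ''-1''.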

Practice

The implementation of the entropy combination estimator additionally requires each ''L_k'' to be an interval. This saves memory by letting the marginal signals share storage with the joint signal. Mutual information, partial mutual information, transfer entropy, and partial transfer entropy can all be arranged to have this property.
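The memory-sharing point can be illustrated with a hypothetical NumPy sketch (the library itself is Matlab/C++): when ''L_k'' is an interval, the marginal signal is a contiguous column slice of the joint signal and can be a view into the same storage, whereas a non-interval subset forces a copy:

```python
import numpy as np

# Hypothetical joint signal with d = 4 components and n = 5 samples,
# stored as an (n, d) array.
joint = np.arange(20.0).reshape(5, 4)

# If L_k is an interval, say components 1..2, the marginal signal is a
# basic slice of the joint signal: a view that shares memory, so no
# extra storage is needed for the marginal.
marginal = joint[:, 1:3]
print(np.shares_memory(joint, marginal))      # True: no copy made

# If L_k is not an interval, say {0, 3}, fancy indexing is needed,
# which copies the data into new storage.
noninterval = joint[:, [0, 3]]
print(np.shares_memory(joint, noninterval))   # False: a copy was made
```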

Learn more

Mutual information

Partial mutual information

Partial transfer entropy

Transfer entropy

Files

Entropy combination estimation

entropy_combination.m

Estimation of entropy combinations

entropy_combination.h

entropy_combination.hpp

Temporal entropy combination estimation

entropy_combination_t.m

Temporal estimation of entropy combinations

entropy_combination_t.h

entropy_combination_t.hpp

entropy_combination

matlab_entropy_combination.cpp

entropy_combination_t

matlab_entropy_combination_t.cpp