Divergence

The Kullback-Leibler divergence between two random variables ''X'' and ''Y'' is defined by

''D(X, Y) = int_{RR^d} f(x) text(log)((f(x)) / (g(x))) dx,''

where ''f, g: RR^d -> RR'' are the probability density functions of ''X'' and ''Y'', respectively. In TIM, the Kullback-Leibler divergence is abbreviated simply as the divergence.
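
To make the definition concrete, the following sketch (illustrative Python/SciPy, not part of TIM) evaluates the defining integral numerically for two one-dimensional Gaussian densities and checks the result against the well-known closed form for Gaussians; the function names are chosen here only for illustration.

    import numpy as np
    from scipy.integrate import quad
    from scipy.stats import norm

    def kl_divergence_1d(mu1, sigma1, mu2, sigma2):
        # Numerically integrate f(x) log(f(x) / g(x)) over the real line,
        # where f and g are the densities of N(mu1, sigma1^2) and N(mu2, sigma2^2).
        integrand = lambda x: norm.pdf(x, mu1, sigma1) * (
            norm.logpdf(x, mu1, sigma1) - norm.logpdf(x, mu2, sigma2))
        value, _ = quad(integrand, -np.inf, np.inf)
        return value

    def kl_gaussians_closed_form(mu1, sigma1, mu2, sigma2):
        # Known closed form for two univariate Gaussians, used as a check.
        return (np.log(sigma2 / sigma1)
                + (sigma1**2 + (mu1 - mu2)**2) / (2.0 * sigma2**2)
                - 0.5)

    print(kl_divergence_1d(0.0, 1.0, 1.0, 2.0))         # approximately 0.443
    print(kl_gaussians_closed_form(0.0, 1.0, 1.0, 2.0))  # 0.443...

When the densities are unknown and only samples of ''X'' and ''Y'' are available, such direct integration is not possible; this is where divergence estimators such as the one linked below come in.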

Learn more

Wang-Kulkarni-Verdu estimator