Statistical Estimation of the Kullback–Leibler Divergence


Bibliographic Details
Main Authors: Alexander Bulinski, Denis Dimitrov
Format: Article
Language: English
Published: MDPI AG 2021-03-01
Series: Mathematics
Subjects:
Online Access: https://www.mdpi.com/2227-7390/9/5/544
Description
Summary: Asymptotic unbiasedness and L²-consistency are established, under mild conditions, for estimates of the Kullback–Leibler divergence between two probability measures in ℝᵈ that are absolutely continuous with respect to (w.r.t.) the Lebesgue measure. These estimates are based on certain k-nearest neighbor statistics for a pair of independent identically distributed (i.i.d.) vector samples. The novelty of the results also lies in the treatment of mixture models; in particular, they cover mixtures of nondegenerate Gaussian measures. The corresponding asymptotic properties of related estimators for the Shannon entropy and cross-entropy are strengthened, and some applications are indicated.
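
The summary refers to divergence estimates built from k-nearest neighbor statistics of two i.i.d. samples. As a rough illustration only (not the authors' construction), the sketch below implements the classical k-NN divergence estimate of Wang–Kulkarni–Verdú type, which is the kind of statistic analyzed in this line of work; the function name knn_kl_divergence, the use of NumPy/SciPy, and the default k = 1 are assumptions made for the example.

```python
import numpy as np
from scipy.spatial import cKDTree


def knn_kl_divergence(x, y, k=1):
    """Sketch of a k-NN estimate of D(P || Q) from samples x ~ P and y ~ Q.

    x : (n, d) array of i.i.d. samples from P
    y : (m, d) array of i.i.d. samples from Q
    k : number of nearest neighbors used in the statistic (assumed choice)
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n, d = x.shape
    m = y.shape[0]

    # Distance from each x_i to its k-th nearest neighbor within the x sample
    # (query k+1 neighbors because the closest point to x_i in x is x_i itself).
    rho = cKDTree(x).query(x, k=k + 1)[0][:, k]

    # Distance from each x_i to its k-th nearest neighbor in the y sample.
    nu = cKDTree(y).query(x, k=k)[0]
    if k > 1:
        nu = nu[:, k - 1]

    # Classical k-NN divergence estimate (Wang–Kulkarni–Verdú form).
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1.0))
```

As a sanity check, one can run this on two multivariate Gaussian samples and compare the output with the closed-form Kullback–Leibler divergence between the corresponding normal laws.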
ISSN: 2227-7390