Summary: Asymptotic unbiasedness and <inline-formula><math xmlns="http://www.w3.org/1998/Math/MathML" display="inline"><semantics><msup><mi>L</mi><mn>2</mn></msup></semantics></math></inline-formula>-consistency are established, under mild conditions, for estimates of the Kullback–Leibler divergence between two probability measures in <inline-formula><math xmlns="http://www.w3.org/1998/Math/MathML" display="inline"><semantics><msup><mi mathvariant="double-struck">R</mi><mi>d</mi></msup></semantics></math></inline-formula> that are absolutely continuous with respect to (w.r.t.) the Lebesgue measure. These estimates are based on certain <i>k</i>-nearest neighbor statistics for a pair of independent identically distributed (i.i.d.) vector samples. The novelty of the results also lies in the treatment of mixture models; in particular, they cover mixtures of nondegenerate Gaussian measures. The aforementioned asymptotic properties of related estimators for the Shannon entropy and cross-entropy are strengthened. Some applications are indicated.
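To illustrate the kind of <i>k</i>-nearest neighbor statistic the abstract refers to, the following is a minimal sketch of a classical k-NN Kullback–Leibler divergence estimator in the style of Wang, Kulkarni, and Verdú, built from two i.i.d. samples. It is an assumption-laden illustration, not the paper's exact construction: the function name, the choice of library, and the bias-correction term are ours.

```python
import numpy as np
from scipy.spatial import cKDTree


def knn_kl_divergence(x, y, k=1):
    """Sketch of a k-NN estimate of D(P || Q) from samples x ~ P, y ~ Q.

    Illustrative only; the estimator studied in the paper may differ in
    its precise form and in the conditions required for consistency.
    """
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    n, d = x.shape
    m = y.shape[0]
    # rho_k(i): distance from x_i to its k-th nearest neighbor within x
    # (query k+1 neighbors because the nearest one is x_i itself).
    rho = cKDTree(x).query(x, k=k + 1)[0][:, -1]
    # nu_k(i): distance from x_i to its k-th nearest neighbor within y.
    nu = cKDTree(y).query(x, k=k)[0]
    if k > 1:
        nu = nu[:, -1]
    # Ratio of log-distances scaled by dimension, plus a sample-size term.
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1))
```

For two samples drawn from the same continuous distribution, the estimate should be close to zero; shifting one sample drives it upward, consistent with the divergence growing.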