On a Generalization of the Jensen–Shannon Divergence and the Jensen–Shannon Centroid
The Jensen–Shannon divergence is a renowned bounded symmetrization of the Kullback–Leibler divergence which does not require probability densities to have matching supports. In this paper, we introduce a vector-skew generalization of the scalar ...
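As a quick reference for the abstract above, here is a minimal sketch of the plain (scalar, unskewed) Jensen–Shannon divergence between two discrete distributions; the helper names `kl` and `jensen_shannon` are illustrative and not taken from the article. With natural logarithms the divergence is bounded above by log 2 and stays finite even when the two supports are disjoint, which is the property the abstract highlights.

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence KL(p || q) for discrete distributions.
    Terms with p_i == 0 contribute 0; assumes q_i > 0 wherever p_i > 0."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def jensen_shannon(p, q):
    """Jensen-Shannon divergence: average KL from p and q to their mixture m.
    Symmetric and bounded by log(2), even for mismatched supports."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = 0.5 * (p + q)  # the mixture is positive wherever p or q is positive
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Disjoint supports: KL(p || q) would be infinite, but the JSD stays bounded.
p = [1.0, 0.0]
q = [0.0, 1.0]
print(jensen_shannon(p, q), np.log(2))  # both equal log(2) ≈ 0.6931
```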
| Main Author: | Frank Nielsen |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2020-02-01 |
| Series: | Entropy |
| Online Access: | https://www.mdpi.com/1099-4300/22/2/221 |
Similar Items

- A method for continuous-range sequence analysis with Jensen-Shannon divergence
  by: Miguel Ángel Ré, et al.
  Published: (2021-02-01)
- On the Jensen–Shannon Symmetrization of Distances Relying on Abstract Means
  by: Frank Nielsen
  Published: (2019-05-01)
- Information Geometric Approach on Most Informative Boolean Function Conjecture
  by: Albert No
  Published: (2018-09-01)
- JS-MA: A Jensen-Shannon Divergence Based Method for Mapping Genome-Wide Associations on Multiple Diseases
  by: Xuan Guo
  Published: (2020-10-01)
- RSSI Probability Density Functions Comparison Using Jensen-Shannon Divergence and Pearson Distribution
  by: Antonios Lionis, et al.
  Published: (2021-04-01)