Some bounds for skewed α-Jensen-Shannon divergence

Based on the skewed Kullback-Leibler divergence introduced in natural language processing, we derive upper and lower bounds on the skewed version of the Jensen-Shannon divergence and investigate their properties. In the process, we generalize the Bretagnolle-Huber inequality, which offers an...
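For orientation, the quantities named in the abstract are commonly written as follows; these are the standard textbook/NLP conventions, offered only as a sketch, and the paper's own parametrization of the α-skewing may differ:

\[
  D_{\mathrm{KL}}(P \,\|\, Q) = \sum_x p(x) \ln \frac{p(x)}{q(x)},
  \qquad
  s_\alpha(P \,\|\, Q) = D_{\mathrm{KL}}\bigl(P \,\|\, \alpha Q + (1-\alpha) P\bigr),
\]
\[
  \mathrm{JS}_\alpha(P, Q) = (1-\alpha)\, D_{\mathrm{KL}}(P \,\|\, M_\alpha) + \alpha\, D_{\mathrm{KL}}(Q \,\|\, M_\alpha),
  \qquad
  M_\alpha = (1-\alpha) P + \alpha Q,
\]
where \(s_\alpha\) is the skewed (Lee-type) Kullback-Leibler divergence and \(\mathrm{JS}_\alpha\) a skewed Jensen-Shannon divergence. The Bretagnolle-Huber inequality mentioned above bounds the total variation distance by the Kullback-Leibler divergence:
\[
  \|P - Q\|_{\mathrm{TV}} \le \sqrt{1 - \exp\bigl(-D_{\mathrm{KL}}(P \,\|\, Q)\bigr)}.
\]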

Bibliographic Details
Main Author: Takuya Yamano
Format: Article
Language: English
Published: Elsevier, 2019-10-01
Series: Results in Applied Mathematics
Online Access: http://www.sciencedirect.com/science/article/pii/S2590037419300640