Divergence Measure of Belief Function and Its Application in Data Fusion

Bibliographic Details
Main Authors: Yutong Song, Yong Deng
Format: Article
Language: English
Published: IEEE 2019-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/8784148/
Description
Summary: Divergence measures are widely used in many applications. To deal efficiently with uncertainty in real applications, the basic probability assignment (BPA) of Dempster-Shafer evidence theory is adopted instead of a probability distribution. As a result, an open issue is how to measure the divergence between BPAs. In this paper, a new divergence measure of two BPAs is proposed. The proposed divergence measure generalizes the Kullback-Leibler divergence: when a BPA degenerates to a probability distribution, the proposed belief divergence equals the Kullback-Leibler divergence. Furthermore, compared with existing belief divergence measures, the new method performs better in situations with a great degree of uncertainty and ambiguity. Numerical examples illustrate the efficiency of the proposed divergence measure. In addition, based on the proposed belief divergence measure, a combination model is proposed to address data fusion. Finally, an example in target recognition illustrates the advantage of the new belief divergence in handling not only extreme uncertainty but also highly conflicting data.
ISSN:2169-3536
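
The summary states the key degeneration property (belief divergence reducing to Kullback-Leibler divergence for probability distributions) but gives neither the closed form of the measure nor the combination model. The Python sketch below is a minimal illustration under two assumptions that are not taken from the paper: the belief divergence is written in the hypothetical KL-like form D(m1 || m2) = Σ_A m1(A) log(m1(A)/m2(A)) over focal elements, and the combination model is a generic divergence-weighted averaging of the evidence followed by Dempster's rule. Neither is necessarily the paper's exact formulation; the sketch only demonstrates the reduction-to-KL property the summary describes and the general shape of divergence-based fusion.

```python
from math import log

def belief_divergence(m1, m2):
    """KL-style divergence between two BPAs, where a BPA maps focal
    elements (frozensets of hypotheses) to masses summing to 1.

    Assumed form, for illustration only:
        D(m1 || m2) = sum over focal elements A of m1(A) * log(m1(A) / m2(A)).
    Every focal element of m1 is assumed to carry positive mass in m2.
    When both BPAs put mass only on singletons (i.e., they are ordinary
    probability distributions), this is exactly the KL divergence.
    """
    return sum(mass * log(mass / m2[a]) for a, mass in m1.items() if mass > 0.0)

def dempster_combine(m1, m2):
    """Dempster's rule of combination: multiply the masses of every pair
    of focal elements, accumulate the product on their intersection, and
    renormalize by 1 - K, where K is the total mass falling on empty
    intersections (the conflict). Undefined when K = 1 (total conflict)."""
    combined, conflict = {}, 0.0
    for b, mb in m1.items():
        for c, mc in m2.items():
            inter = b & c
            if inter:
                combined[inter] = combined.get(inter, 0.0) + mb * mc
            else:
                conflict += mb * mc
    return {a: v / (1.0 - conflict) for a, v in combined.items()}

def divergence_weighted_fusion(bpas):
    """Hypothetical divergence-based combination model (an assumption, not
    necessarily the paper's algorithm): weight each body of evidence by the
    inverse of its average divergence to the others, form the weighted-average
    BPA, then fuse n copies of it with n - 1 Dempster combinations."""
    n = len(bpas)
    avg_div = [sum(belief_divergence(bpas[i], bpas[j])
                   for j in range(n) if j != i) / (n - 1)
               for i in range(n)]
    inv = [1.0 / (d + 1e-12) for d in avg_div]   # low divergence -> high weight
    weights = [v / sum(inv) for v in inv]
    averaged = {}
    for w, m in zip(weights, bpas):
        for a, mass in m.items():
            averaged[a] = averaged.get(a, 0.0) + w * mass
    fused = averaged
    for _ in range(n - 1):                       # n - 1 Dempster combinations
        fused = dempster_combine(fused, averaged)
    return fused

# Bayesian BPAs (singleton focal elements only): reduces to KL divergence.
A, B = frozenset({"a"}), frozenset({"b"})
p = {A: 0.7, B: 0.3}
q = {A: 0.5, B: 0.5}
print(belief_divergence(p, q))                   # == KL(p || q), about 0.0823

# General BPAs may also put mass on composite sets such as {a, b}.
AB = frozenset({"a", "b"})
m1 = {A: 0.6, AB: 0.4}
m2 = {A: 0.2, AB: 0.8}
m3 = {A: 0.7, AB: 0.3}
print(divergence_weighted_fusion([m1, m2, m3]))
```

The weighting step is what such combination models typically add over plain Dempster combination: evidence that diverges strongly from the rest receives a small weight, which is one way to temper the counterintuitive results Dempster's rule can produce on highly conflicting data.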