EE<i>k</i>NN: <i>k</i>-Nearest Neighbor Classifier with an Evidential Editing Procedure for Training Samples


Bibliographic Details
Main Authors: Lianmeng Jiao, Xiaojiao Geng, Quan Pan
Format: Article
Language: English
Published: MDPI AG 2019-05-01
Series: Electronics
Online Access: https://www.mdpi.com/2079-9292/8/5/592
Description
Summary: The <i>k</i>-nearest neighbor (<i>k</i>NN) rule is one of the most popular classification algorithms applied in many fields because it is very simple to understand and easy to design. However, a major problem encountered in using the <i>k</i>NN rule is that all of the training samples are considered equally important in the assignment of the class label to the query pattern. In this paper, an evidential editing version of the <i>k</i>NN rule is developed within the framework of belief function theory. The proposal is composed of two procedures. An evidential editing procedure is first proposed to reassign the original training samples new labels represented by an evidential membership structure, which provides a general model for representing the class membership of the training samples. After editing, a classification procedure specifically designed for evidentially edited training samples is developed in the belief function framework to handle the more general situation in which the edited training samples carry dependent evidential labels. Three synthetic datasets and six real datasets collected from various fields were used to evaluate the performance of the proposed method. The reported results show that the proposal achieves better performance than the other <i>k</i>NN-based methods considered, especially for datasets with high imprecision ratios.
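The two-procedure scheme described in the abstract can be illustrated with a simplified sketch. Note this is not the authors' exact EE<i>k</i>NN algorithm: here the evidential editing step is approximated by softening each crisp label into a class-membership vector estimated from its nearest neighbors, and the classification step by distance-weighted pooling of those soft labels, rather than by the full belief-function combination used in the paper. The function names and the `1e-9` smoothing constant are illustrative choices, not from the source.

```python
import numpy as np

def evidential_edit(X, y, n_classes, k=3):
    """Editing step (simplified): replace each crisp training label with a
    soft class-membership vector, estimated leave-one-out from the
    distance-weighted votes of the sample's k nearest neighbors."""
    n = len(X)
    memberships = np.zeros((n, n_classes))
    for i in range(n):
        d = np.linalg.norm(X - X[i], axis=1)
        d[i] = np.inf                        # exclude the sample itself
        nbrs = np.argsort(d)[:k]
        w = 1.0 / (d[nbrs] + 1e-9)           # closer neighbors weigh more
        for weight, idx in zip(w, nbrs):
            memberships[i, y[idx]] += weight
        memberships[i] /= memberships[i].sum()
    return memberships

def classify(X, memberships, query, k=3):
    """Classification step (simplified): pool the soft labels of the
    query's k nearest edited samples, weighted by inverse distance."""
    d = np.linalg.norm(X - query, axis=1)
    nbrs = np.argsort(d)[:k]
    w = 1.0 / (d[nbrs] + 1e-9)
    pooled = (w[:, None] * memberships[nbrs]).sum(axis=0)
    return int(np.argmax(pooled))
```

A sample in an overlap region between classes ends up with a mixed membership vector after editing, which is exactly the kind of imprecise label the second procedure is designed to exploit:

```python
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.5, (20, 2)),
               rng.normal(3.0, 0.5, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
m = evidential_edit(X, y, n_classes=2)
classify(X, m, np.array([3.0, 3.0]))  # → 1 (query lies in the class-1 cluster)
```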
ISSN:2079-9292