Improving Convolutional Neural Networks’ Accuracy in Noisy Environments Using k-Nearest Neighbors


Bibliographic Details
Main Authors: Antonio-Javier Gallego, Antonio Pertusa, Jorge Calvo-Zaragoza
Format: Article
Language: English
Published: MDPI AG 2018-10-01
Series: Applied Sciences
Online Access: https://www.mdpi.com/2076-3417/8/11/2086
Description
Summary: We present a hybrid approach to improve the accuracy of Convolutional Neural Networks (CNNs) without retraining the model. The proposed architecture replaces the softmax layer with a k-Nearest Neighbor (kNN) algorithm for inference. Although this is a common technique in transfer learning, we apply it to the same domain for which the network was trained. Previous works show that neural codes (neuron activations of the last hidden layers) can benefit from the inclusion of classifiers such as support vector machines or random forests. In this work, our proposed hybrid CNN + kNN architecture is evaluated using several image datasets, network topologies and label noise levels. The results show significant accuracy improvements in the inference stage with respect to the standard CNN with noisy labels, especially with relatively large datasets such as CIFAR100. We also verify that applying the ℓ2 norm to neural codes is statistically beneficial for this approach.
ISSN: 2076-3417
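The inference stage the summary describes (replacing the softmax layer with a kNN classifier over ℓ2-normalized neural codes) can be sketched roughly as follows. This is a minimal illustration using only NumPy, not the authors' implementation: the function names and the majority-vote tie-breaking are assumptions, and in practice the "codes" would be activations extracted from the last hidden layer of a trained CNN rather than the synthetic vectors shown here.

```python
import numpy as np

def l2_normalize(codes, eps=1e-12):
    # Scale each neural-code vector (one row) to unit L2 norm.
    norms = np.maximum(np.linalg.norm(codes, axis=1, keepdims=True), eps)
    return codes / norms

def knn_predict(train_codes, train_labels, test_codes, k=3):
    # Classify each test code by majority vote among its k nearest
    # training codes (Euclidean distance on L2-normalized vectors).
    # Ties are broken by np.argmax, i.e. the smallest label wins.
    train_n = l2_normalize(train_codes)
    test_n = l2_normalize(test_codes)
    preds = []
    for query in test_n:
        dists = np.linalg.norm(train_n - query, axis=1)
        nearest_labels = train_labels[np.argsort(dists)[:k]]
        values, counts = np.unique(nearest_labels, return_counts=True)
        preds.append(values[np.argmax(counts)])
    return np.array(preds)

# Toy example: 2-D "neural codes" for two classes.
train_codes = np.array([[1.0, 0.0], [0.9, 0.1], [0.0, 1.0], [0.1, 0.9]])
train_labels = np.array([0, 0, 1, 1])
test_codes = np.array([[1.0, 0.05], [0.05, 1.0]])
print(knn_predict(train_codes, train_labels, test_codes, k=3))  # [0 1]
```

Because the normalization projects all codes onto the unit sphere, Euclidean distance between normalized vectors becomes a monotonic function of cosine similarity, which is one plausible reason the ℓ2 norm helps this kind of nearest-neighbor inference.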