Closing the Performance Gap between Siamese Networks for Dissimilarity Image Classification and Convolutional Neural Networks
In this paper, we examine two strategies for boosting the performance of ensembles of Siamese networks (SNNs) for image classification using two loss functions (Triplet and Binary Cross Entropy) and two methods for building the dissimilarity spaces (FULLY and DEEPER). With FULLY, the distance between a pattern and a prototype is calculated by comparing two images using the fully connected layer of the Siamese network. With DEEPER, each pattern is described using a deeper layer combined with dimensionality reduction. The basic design of the SNNs takes advantage of supervised k-means clustering for building the dissimilarity spaces that train a set of support vector machines, which are then combined by sum rule for a final decision. The robustness and versatility of this approach are demonstrated on several cross-domain image data sets, including a portrait data set, two bioimage data sets, and two animal vocalization data sets. Results show that the strategies employed in this work to increase the performance of dissimilarity image classification using SNNs are closing the gap with standalone CNNs. Moreover, when our best system is combined with an ensemble of CNNs, the resulting performance is superior to that of the CNN ensemble alone, demonstrating that the new strategy extracts additional information.
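For orientation, the sketch below illustrates the pipeline the abstract describes: supervised (per-class) k-means to select prototypes, a dissimilarity space built from pattern-to-prototype distances, an SVM trained on those dissimilarity vectors, and sum-rule fusion with CNN scores. It is a minimal illustration, not the authors' implementation: the placeholder embed() function, the Euclidean distance, the scikit-learn/SciPy calls, the toy data, and the DCT-based reduction shown for the DEEPER idea are assumptions made here for concreteness.

```python
# Minimal, illustrative sketch of the pipeline summarized in the abstract.
# Assumptions (not taken from the paper): scikit-learn/SciPy, a placeholder
# embed() standing in for one branch of the trained Siamese network, Euclidean
# distances, and random toy data in place of images.
import numpy as np
from scipy.fft import dct
from sklearn.cluster import KMeans
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def embed(x):
    # Stand-in for the Siamese branch output; FULLY would instead score a
    # (pattern, prototype) image pair through the fully connected layer.
    return x

# Toy data: 200 "images" flattened to 64-D vectors, 4 classes.
X = rng.normal(size=(200, 64))
y = rng.integers(0, 4, size=200)

# 1) Supervised k-means: cluster each class separately and keep the centroids
#    as prototypes for the dissimilarity space.
prototypes = np.vstack([
    KMeans(n_clusters=5, n_init=10, random_state=0).fit(X[y == c]).cluster_centers_
    for c in np.unique(y)
])

# 2) Dissimilarity space: represent each pattern by its distances to all
#    prototypes, computed on the embeddings.
def dissimilarity(samples):
    E, P = embed(samples), embed(prototypes)
    return np.linalg.norm(E[:, None, :] - P[None, :, :], axis=2)

D = dissimilarity(X)  # shape (n_samples, n_prototypes)

# 3) Train an SVM on dissimilarity vectors; an ensemble would repeat this with
#    different clusterings/loss functions and fuse the scores by sum rule.
svm = SVC(probability=True).fit(D, y)
scores_svm = svm.predict_proba(dissimilarity(X))

# 4) Sum-rule fusion with (hypothetical) CNN softmax scores.
scores_cnn = rng.dirichlet(np.ones(4), size=200)
predictions = (scores_svm + scores_cnn).argmax(axis=1)

# DEEPER-style idea: describe a pattern with a deeper layer's activation map
# reduced in dimensionality; a 2-D DCT keeping low-frequency coefficients is
# one plausible reduction given the "discrete cosine transform" keyword (the
# paper's exact reduction is not reproduced here).
def deeper_descriptor(feature_map, keep=32):
    coeffs = dct(dct(feature_map, axis=0, norm="ortho"), axis=1, norm="ortho")
    return coeffs[:keep, :keep].ravel()

descriptor = deeper_descriptor(rng.normal(size=(56, 56)))  # 1024-D descriptor
```

The sum-rule fusion at step 4 is what the abstract refers to when combining the dissimilarity-based system with an ensemble of CNNs; in practice the CNN scores would come from trained networks rather than random placeholders.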
Main Authors: Loris Nanni; Giovanni Minchio; Sheryl Brahnam; Davide Sarraggiotto; Alessandra Lumini
Format: Article
Language: English
Published: MDPI AG, 2021-08-01
Series: Sensors
Subjects: Siamese networks; ensemble of classifiers; loss function; discrete cosine transform
Online Access: https://www.mdpi.com/1424-8220/21/17/5809
id: doaj-31257fc1062540cdb3d7b1ce2a491c24
Collection: DOAJ
Citation: Sensors 21(17): 5809 (2021); ISSN 1424-8220; DOI 10.3390/s21175809
Author affiliations:
Loris Nanni, Giovanni Minchio, Davide Sarraggiotto: Department of Information Engineering (DEI), University of Padova, 35131 Padova, Italy
Sheryl Brahnam: Department of Information Technology and Cybersecurity, Missouri State University, 901 S. National Street, Springfield, MO 65804, USA
Alessandra Lumini: Department of Computer Science and Engineering (DISI), University of Bologna, Via dell’Università 50, 47521 Cesena, Italy