Training Small Networks for Scene Classification of Remote Sensing Images via Knowledge Distillation
Scene classification, which aims to identify the land-cover categories of remotely sensed image patches, is now a fundamental task in remote sensing image analysis. Deep-learning-based algorithms are widely applied to scene classification and achieve remarkable performance, but these high-capacity methods are computationally expensive and time-consuming. Consequently, in this paper we introduce knowledge distillation, currently a mainstream model compression method, into remote sensing scene classification to improve the performance of smaller and shallower network models. Our knowledge distillation training method makes the high-temperature softmax output of a small and shallow student model match that of a large and deep teacher model. In our experiments, we evaluate this training method for remote sensing scene classification on four public datasets: AID, UCMerced, NWPU-RESISC, and EuroSAT. The results show that the proposed method is effective, increasing overall accuracy for small and shallow models by about 3% on AID, 5% on UCMerced, and 1% on NWPU-RESISC and EuroSAT. We further explore the performance of the student model on small and unbalanced datasets. Our findings indicate that knowledge distillation can improve the performance of small network models on datasets with lower-spatial-resolution images, numerous categories, and fewer training samples.
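The high-temperature softmax matching described in the abstract is Hinton-style knowledge distillation, which can be sketched as follows. This is a minimal illustration, not the paper's implementation: the temperature `T = 4.0`, the loss weight `alpha = 0.7`, and the function names are illustrative assumptions.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax: higher T yields a softer distribution."""
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Illustrative KD loss: alpha * T^2 * KL(teacher_soft || student_soft)
    plus (1 - alpha) * cross-entropy against the hard ground-truth labels."""
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    # KL divergence between the softened teacher and student outputs
    kl = np.sum(p_teacher * (np.log(p_teacher) - np.log(p_student)), axis=-1)
    # Standard cross-entropy of the student (T = 1) with the true labels
    hard = -np.log(softmax(student_logits)[np.arange(len(labels)), labels])
    return np.mean(alpha * (T ** 2) * kl + (1 - alpha) * hard)
```

The `T ** 2` factor follows the original distillation formulation, keeping the gradient magnitude of the soft-target term comparable across temperatures.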
Main Authors: Guanzhou Chen, Xiaodong Zhang, Xiaoliang Tan, Yufeng Cheng, Fan Dai, Kun Zhu, Yuanfu Gong, Qing Wang
Format: Article
Language: English
Published: MDPI AG, 2018-05-01
Series: Remote Sensing
Subjects: knowledge distillation; scene classification; convolutional neural networks (CNNs); remote sensing; deep learning
Online Access: http://www.mdpi.com/2072-4292/10/5/719
id: doaj-27234ad62ced4f2bbbceb8ce9b155ded
record_format: Article
spelling:
record id: doaj-27234ad62ced4f2bbbceb8ce9b155ded (indexed 2020-11-25T00:51:46Z)
publisher: MDPI AG; series: Remote Sensing (ISSN 2072-4292)
published: 2018-05-01; volume 10, issue 5, article 719; DOI: 10.3390/rs10050719 (rs10050719)
title: Training Small Networks for Scene Classification of Remote Sensing Images via Knowledge Distillation
authors: Guanzhou Chen, Xiaodong Zhang, Xiaoliang Tan, Yufeng Cheng, Fan Dai, Kun Zhu, Yuanfu Gong, Qing Wang
affiliation (all authors): State Key Laboratory of Information Engineering in Surveying, Mapping and Remote Sensing, Wuhan University, Wuhan 430079, China
keywords: knowledge distillation; scene classification; convolutional neural networks (CNNs); remote sensing; deep learning
url: http://www.mdpi.com/2072-4292/10/5/719
collection: DOAJ
author: Guanzhou Chen, Xiaodong Zhang, Xiaoliang Tan, Yufeng Cheng, Fan Dai, Kun Zhu, Yuanfu Gong, Qing Wang
issn: 2072-4292
description: Scene classification, which aims to identify the land-cover categories of remotely sensed image patches, is now a fundamental task in remote sensing image analysis. Deep-learning-based algorithms are widely applied to scene classification and achieve remarkable performance, but these high-capacity methods are computationally expensive and time-consuming. Consequently, in this paper we introduce knowledge distillation, currently a mainstream model compression method, into remote sensing scene classification to improve the performance of smaller and shallower network models. Our knowledge distillation training method makes the high-temperature softmax output of a small and shallow student model match that of a large and deep teacher model. In our experiments, we evaluate this training method for remote sensing scene classification on four public datasets: AID, UCMerced, NWPU-RESISC, and EuroSAT. The results show that the proposed method is effective, increasing overall accuracy for small and shallow models by about 3% on AID, 5% on UCMerced, and 1% on NWPU-RESISC and EuroSAT. We further explore the performance of the student model on small and unbalanced datasets. Our findings indicate that knowledge distillation can improve the performance of small network models on datasets with lower-spatial-resolution images, numerous categories, and fewer training samples.
topic: knowledge distillation; scene classification; convolutional neural networks (CNNs); remote sensing; deep learning
url: http://www.mdpi.com/2072-4292/10/5/719