Mitigate Catastrophic Forgetting in Convolutional Neural Networks for Effective Instance Recognition
Main Authors: | Da-Fang Ke 柯達方 |
---|---|
Other Authors: | Ren C. Luo 羅仁權 |
Format: | Others |
Language: | zh-TW |
Published: | 2017 |
Online Access: | http://ndltd.ncl.edu.tw/handle/aygafr |
id | ndltd-TW-105NTU05442065 |
---|---|
record_format | oai_dc |
spelling | ndltd-TW-105NTU054420652019-05-15T23:39:39Z http://ndltd.ncl.edu.tw/handle/aygafr Mitigate Catastrophic Forgetting in Convolutional Neural Networks for Effective Instance Recognition 減緩卷積類神經網路之災難性失憶問題以有效達成物體辨識 Da-Fang Ke 柯達方 碩士 國立臺灣大學 電機工程學研究所 105 Ren C. Luo 羅仁權 2017 學位論文 ; thesis 58 zh-TW |
collection | NDLTD |
language | zh-TW |
format | Others |
sources | NDLTD |
description |
Master's === National Taiwan University === Graduate Institute of Electrical Engineering === 105 === Object recognition has long been an important research topic in computer vision and plays a critical role in robotics. Robots need to extract useful information from rich visual feedback and convert it into high-level semantic knowledge, which in turn enables intelligent behavior. Before the emergence of Convolutional Neural Networks (CNNs), however, the fundamental ability to recognize objects remained insufficient. Since Alex Krizhevsky successfully applied deep CNNs to large-scale image classification, CNNs have brought considerable success to the computer vision community. Yet many practical concerns still need to be overcome before intelligent systems become truly useful.
In this thesis, we focus on a practical issue: robots must be able to incrementally learn new objects. We first argue that an intelligent service robot working in a particular environment needs to recognize instances under different imaging conditions (scale, brightness, occlusion, etc.). With modern deep learning methods, a reliable visual system can be trained given sufficient data. The issue, however, is that at the outset a complete dataset that covers all instances to be learned and all relevant imaging conditions is rarely available. In practice, supervisors collect new data and retrain recognition systems repeatedly and incrementally, so an incremental learning approach is needed to meet this requirement. A direct solution would be to reuse all past data together with the new data to maintain performance. While this may work, it requires a reservoir of persistent training data across all learning stages, an assumption that does not always hold. To this end, we investigate instance recognition in continual learning scenarios without access to previous data. Under the hood, we are investigating how to mitigate catastrophic forgetting, a phenomenon in which previously learned knowledge is destroyed when a neural network is trained on new data. In this thesis, we propose pseudorehearsal with imaging recollection and pseudo neurons to address the forgetting problem. Our approach achieves a promising trade-off between learning new knowledge and preserving old knowledge. We demonstrate the feasibility of our approach through experiments and comparisons with other approaches, and we provide insights into our method through experimental analysis.
|
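For context, the sketch below illustrates the generic pseudorehearsal idea that the abstract builds on: random pseudo-inputs are labeled by a frozen copy of the previously trained network, and these pseudo-items are interleaved with the new data so the old input-output mapping keeps being rehearsed. This is a minimal, assumed illustration only, not the thesis's specific imaging-recollection or pseudo-neuron method; the toy model, data shapes, and hyperparameters are placeholders.

```python
# Minimal sketch of classic pseudorehearsal (Robins-style), assuming PyTorch.
# NOT the thesis's method; model size, shapes, and labels are illustrative.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

# A toy classifier standing in for the CNN: 32-dim inputs, 10 classes (assumed).
model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))
# ... stage-1 training on the old instances is assumed to have happened here ...

# Build the pseudo-rehearsal set from a frozen copy of the old network.
old_model = copy.deepcopy(model).eval()
with torch.no_grad():
    pseudo_x = torch.rand(256, 32)                 # random pseudo-inputs
    pseudo_y = F.softmax(old_model(pseudo_x), 1)   # old network's responses as soft targets

# Incremental stage: new instances arrive, old data is no longer available.
new_x = torch.rand(128, 32)
new_y = torch.randint(0, 10, (128,))               # placeholder labels for the new data

opt = torch.optim.SGD(model.parameters(), lr=0.01)
for epoch in range(20):
    opt.zero_grad()
    # Loss on the new data (hard labels).
    loss_new = F.cross_entropy(model(new_x), new_y)
    # Rehearsal loss: keep the current outputs on pseudo-items close to the old
    # network's outputs, which is what preserves previously learned knowledge.
    loss_old = F.kl_div(F.log_softmax(model(pseudo_x), 1), pseudo_y,
                        reduction="batchmean")
    (loss_new + loss_old).backward()
    opt.step()
```

The key design choice in this generic form is that no stored training images are required: the pseudo-items act as a stand-in for the old data, trading exact replay for a compact, data-free approximation of the old mapping.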
author2 | Ren C. Luo |
author | Da-Fang Ke 柯達方 |
title | Mitigate Catastrophic Forgetting in Convolutional Neural Networks for Effective Instance Recognition |
publishDate | 2017 |
url | http://ndltd.ncl.edu.tw/handle/aygafr |
_version_ | 1719151884432834560 |