Hierarchical ART Neural Networks for Character Recognition


Bibliographic Details
Main Authors: Chao-Ying Su, 蘇昭穎
Other Authors: Chao Yin Hsiao
Format: Others
Language: zh-TW

Published: 2009

Online Access: http://ndltd.ncl.edu.tw/handle/54473984208344675798
id ndltd-TW-097FCU05489018
record_format oai_dc
spelling ndltd-TW-097FCU054890182015-11-13T04:09:17Z http://ndltd.ncl.edu.tw/handle/54473984208344675798 Hierarchical ART Neural Networks for Character Recognition 使用階層式適應性共振理論類神經網路於字型辨識 Chao-Ying Su 蘇昭穎 Master's, Feng Chia University, Graduate Institute of Mechanical Engineering, 97. In this thesis, we design simplified ART neural networks, based on the concept of the In-star and Out-star loops, for storing and retrieving patterns, and arrange these simplified ART networks within a decision tree to form a hierarchical classifier. We investigate the characteristics of the classifier and design the algorithms for storing and testing patterns. Each character pattern is first encoded as a binary pattern, from which the related feature parameters are then extracted. The bottom (leaf) nodes of the decision tree are implemented with the simplified ART neural networks, which operate on the binary pattern vectors; the non-bottom nodes are vector-quantization classifiers that operate on the feature parameters to classify patterns into clusters. This arrangement greatly reduces the computation required for training and decision making. For the simplified ART networks at the bottom nodes, an algorithm for pattern recognition, storage, and retrieval is designed, together with an adjustment method that handles the stability-plasticity dilemma, so that clean, confirmed patterns are stored efficiently while patterns with different degrees of distortion or noise corruption are stored separately. In the experiments, the 26 alphabetic characters and 10 Arabic numerals in the Times New Roman font are used. The experiments are divided into three stages: the first stage converts each of the 36 characters into a binary pattern and stores it; the second stage converts each binary pattern into 7 feature parameters; finally, in the last stage, the hierarchical ART neural networks are trained and used to recognize test character patterns that may or may not be among the 36 trained patterns. Chao Yin Hsiao 蕭肇殷 2009 degree thesis ; thesis 72 zh-TW
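The hierarchical scheme the abstract describes (vector-quantization routing at the non-leaf nodes, simplified ART storage of binary patterns at the leaf nodes) can be sketched as below. This is an illustrative reconstruction under assumed details, not the thesis's actual code: the vigilance value `rho`, the intersection-based fast-learning rule, and Euclidean-distance VQ routing are assumptions standing in for the unspecified algorithms.

```python
import numpy as np

def vq_route(features, codebook):
    """Non-leaf node: send a pattern down the branch whose prototype
    feature vector is nearest in Euclidean distance (assumed metric)."""
    dists = np.linalg.norm(np.asarray(codebook, dtype=float) - features, axis=1)
    return int(np.argmin(dists))

class SimplifiedART:
    """Leaf node: simplified ART-1-style storage of binary patterns.

    A pattern resonates with the first stored prototype whose match
    ratio |I AND w| / |I| reaches the vigilance rho; that prototype is
    then refined by intersection (fast learning). If nothing resonates,
    a new category is created. This gives plasticity for novel patterns
    without overwriting earlier codes, which is one common way to handle
    the stability-plasticity dilemma the abstract mentions.
    """
    def __init__(self, rho=0.7):
        self.rho = rho
        self.prototypes = []

    def present(self, pattern):
        p = np.asarray(pattern, dtype=bool)
        for i, w in enumerate(self.prototypes):
            match = np.count_nonzero(p & w) / max(np.count_nonzero(p), 1)
            if match >= self.rho:
                self.prototypes[i] = p & w   # refine the stored code
                return i
        self.prototypes.append(p.copy())     # no resonance: new category
        return len(self.prototypes) - 1
```

In this sketch, distorted variants of a stored character either resonate with its category (and only shrink the prototype by intersection) or, past the vigilance threshold, open a separate category, matching the abstract's claim that clean patterns and noise-corrupted patterns are stored apart.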
collection NDLTD
language zh-TW
format Others
sources NDLTD
description Master's === Feng Chia University === Graduate Institute of Mechanical Engineering === 97 === In this thesis, we design simplified ART neural networks, based on the concept of the In-star and Out-star loops, for storing and retrieving patterns, and arrange these simplified ART networks within a decision tree to form a hierarchical classifier. We investigate the characteristics of the classifier and design the algorithms for storing and testing patterns. Each character pattern is first encoded as a binary pattern, from which the related feature parameters are then extracted. The bottom (leaf) nodes of the decision tree are implemented with the simplified ART neural networks, which operate on the binary pattern vectors; the non-bottom nodes are vector-quantization classifiers that operate on the feature parameters to classify patterns into clusters. This arrangement greatly reduces the computation required for training and decision making. For the simplified ART networks at the bottom nodes, an algorithm for pattern recognition, storage, and retrieval is designed, together with an adjustment method that handles the stability-plasticity dilemma, so that clean, confirmed patterns are stored efficiently while patterns with different degrees of distortion or noise corruption are stored separately. In the experiments, the 26 alphabetic characters and 10 Arabic numerals in the Times New Roman font are used. The experiments are divided into three stages: the first stage converts each of the 36 characters into a binary pattern and stores it; the second stage converts each binary pattern into 7 feature parameters; finally, in the last stage, the hierarchical ART neural networks are trained and used to recognize test character patterns that may or may not be among the 36 trained patterns.
author2 Chao Yin Hsiao
author_facet Chao Yin Hsiao
Chao-Ying Su
蘇昭穎
author Chao-Ying Su
蘇昭穎
spellingShingle Chao-Ying Su
蘇昭穎
Hierarchical ART Neural Networks for Character Recognition
author_sort Chao-Ying Su
title Hierarchical ART Neural Networks for Character Recognition
title_short Hierarchical ART Neural Networks for Character Recognition
title_full Hierarchical ART Neural Networks for Character Recognition
title_fullStr Hierarchical ART Neural Networks for Character Recognition
title_full_unstemmed Hierarchical ART Neural Networks for Character Recognition
title_sort hierarchical art neural networks for character recognition
publishDate 2009
url http://ndltd.ncl.edu.tw/handle/54473984208344675798
work_keys_str_mv AT chaoyingsu hierarchicalartneuralnetworksforcharacterrecognition
AT sūzhāoyǐng hierarchicalartneuralnetworksforcharacterrecognition
AT chaoyingsu shǐyòngjiēcéngshìshìyīngxìnggòngzhènlǐlùnlèishénjīngwǎnglùyúzìxíngbiànshí
AT sūzhāoyǐng shǐyòngjiēcéngshìshìyīngxìnggòngzhènlǐlùnlèishénjīngwǎnglùyúzìxíngbiànshí
_version_ 1718129952911523840