Hierarchical ART Neural Networks for Character Recognition


Bibliographic Details
Main Authors: Chao-Ying Su, 蘇昭穎
Other Authors: Chao Yin Hsiao
Format: Others
Language: zh-TW
Published: 2009
Online Access: http://ndltd.ncl.edu.tw/handle/54473984208344675798
Description
Summary: Master's thesis === 逢甲大學 (Feng Chia University) === Graduate Institute of Mechanical Engineering === 97 === In this article, we design simplified ART neural networks based on the concept of In-star and Out-star loops for storing and retrieving patterns, and combine these simplified ART neural networks in the form of a decision tree to build a hierarchical classifier. We investigate the characteristics of the classifier and design the algorithms for storing and testing patterns. For each character pattern, binary parameters are adopted, and the related feature parameters are then extracted. The bottom nodes of the decision tree are implemented with the simplified ART neural networks, which are designed to work on the binary pattern vectors; the non-bottom nodes of the decision tree are vector-quantization classifiers that work on the feature parameters to classify clusters. In this way, the computation required for training and decision making is greatly reduced. For the simplified ART neural networks at the bottom nodes of the decision tree, an algorithm for pattern recognition, pattern storage, and retrieval is designed, and an adjusting method is included to handle the stability-plasticity dilemma, so that clean, confirmed patterns can be stored efficiently, while patterns with different degrees of distortion or noise corruption can be stored separately. For the experiments, 26 alphabetic characters and 10 Arabic numerals in the Times New Roman font are adopted. The experiments are divided into three stages: in the first stage, each of the 36 characters is converted and saved as a binary pattern; in the second stage, each of the 36 binary patterns is converted into 7 feature parameters; finally, in the last stage, we adopt the hierarchical ART neural networks, train the networks, and use them to recognize test character patterns that may or may not be included in the 36 trained patterns.
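
The abstract describes a two-level architecture: non-bottom decision-tree nodes perform vector quantization on feature parameters to route a pattern to a cluster, and a simplified ART module at the selected bottom node stores or retrieves the binary pattern vector under a vigilance test. The following Python sketch is only an illustration of that idea, not the thesis implementation; the class names, the vigilance value, the AND-based prototype update, and the toy 7-element feature vectors and 12-bit binary patterns are assumptions introduced here for clarity.

# Hedged sketch of a hierarchical classifier in the spirit of the abstract:
# a VQ node routes a feature vector to the nearest centroid, and an ART1-like
# leaf stores/retrieves binary patterns with a vigilance test. All details
# below (update rule, thresholds, dimensions) are illustrative assumptions.
import numpy as np

class SimplifiedART:
    """ART1-like leaf node: stores binary prototypes, matches with vigilance."""
    def __init__(self, vigilance=0.8):
        self.vigilance = vigilance
        self.prototypes = []          # stored binary pattern vectors
        self.labels = []              # class label for each prototype

    def train(self, pattern, label):
        idx = self._match(pattern)
        if idx is None:               # no prototype passes vigilance: store a new one
            self.prototypes.append(pattern.copy())
            self.labels.append(label)
        else:                         # merge: keep only commonly active bits (AND rule)
            self.prototypes[idx] = np.logical_and(self.prototypes[idx], pattern).astype(int)

    def classify(self, pattern):
        idx = self._match(pattern)
        return self.labels[idx] if idx is not None else None

    def _match(self, pattern):
        best, best_score = None, -1.0
        for i, proto in enumerate(self.prototypes):
            overlap = np.sum(np.logical_and(proto, pattern))
            score = overlap / (1e-9 + np.sum(pattern))   # fraction of input explained
            if score >= self.vigilance and score > best_score:
                best, best_score = i, score
        return best

class VQNode:
    """Non-bottom node: routes a feature vector to the nearest cluster centroid."""
    def __init__(self, centroids):
        self.centroids = np.asarray(centroids, dtype=float)
        self.children = [SimplifiedART() for _ in centroids]

    def route(self, features):
        d = np.linalg.norm(self.centroids - np.asarray(features, dtype=float), axis=1)
        return self.children[int(np.argmin(d))]

# Toy usage: two clusters in a 7-dimensional feature space, 12-bit binary patterns.
root = VQNode(centroids=[[0.2] * 7, [0.8] * 7])
pattern_A = np.array([1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1])
features_A = [0.25] * 7
root.route(features_A).train(pattern_A, label="A")
print(root.route(features_A).classify(pattern_A))   # -> "A"

In this sketch, routing on the low-dimensional feature vector keeps the expensive binary-pattern matching confined to one small leaf module, which is the computational saving the abstract attributes to the hierarchical design.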