2-D Invariant Pattern Recognition Using a Backpropogation Network Improved by Distributed Associative Memory
Master's === National Cheng Kung University === Institute of Electrical Engineering === 82 === In this paper, a system comprising image preprocessing and neural networks is proposed. The various functional units of the image preprocessing are used to obtain an invariant image representation in the beginn...
Main Authors: | Jau-Ling Shih, 石昭玲 |
---|---|
Other Authors: | Pau-Choo Chung |
Format: | Others |
Language: | en_US |
Published: | 1994 |
Online Access: | http://ndltd.ncl.edu.tw/handle/47410554866730966852 |
id |
ndltd-TW-082NCKU0442089 |
---|---|
record_format |
oai_dc |
spelling |
ndltd-TW-082NCKU04420892015-10-13T15:36:51Z http://ndltd.ncl.edu.tw/handle/47410554866730966852 2-D Invariant Pattern Recognition Using a Backpropogation Network Improved by Distributed Associative Memory 加強式反向傳播網路之二維圖形辨識系統 Jau-Ling Shih 石昭玲 Master's National Cheng Kung University Institute of Electrical Engineering 82 In this paper, a system comprising image preprocessing and neural networks is proposed. The various functional units of the image preprocessing are used to obtain an invariant image representation at the front of the system. The dimensionality of the network weight space is reduced by reducing the feature dimension before the preprocessed features are applied to the networks. Then, several neural models are proposed for pattern recognition: (1) a distributed associative memory (DAM), (2) a backpropagation network (BPN), (3) a DAM combined with a BPN, and (4) a BPN with the associative memory as initial weights. In case (3), the hierarchical network consists of two levels of neural networks. At the lower level, a DAM receives the output vectors of the image-preprocessing functions to create a system that recognizes patterns regardless of changes in scale or rotation. The higher level is a two-layer BPN that receives the recalled information from the memorized database of the lower level. Placing a BPN after the DAM raises the recognition ratio compared with a DAM alone and is faster than a BPN alone. In case (4), training of the BPN is greatly accelerated because the network uses the associative memory of a DAM as the initial weights of the first layer of the BPN. Experimental results show that the system recognizes all patterns correctly when the percentage of white noise is under 20% for cases (3) and (4). Pau-Choo Chung 詹寶珠 1994 學位論文 ; thesis 60 en_US |
collection |
NDLTD |
language |
en_US |
format |
Others |
sources |
NDLTD |
description |
Master's === National Cheng Kung University === Institute of Electrical Engineering === 82 === In this paper, a system comprising image preprocessing and neural networks is proposed. The various functional units of the image preprocessing are used to obtain an invariant image representation at the front of the system. The dimensionality of the network weight space is reduced by reducing the feature dimension before the preprocessed features are applied to the networks. Then, several neural models are proposed for pattern recognition: (1) a distributed associative memory (DAM), (2) a backpropagation network (BPN), (3) a DAM combined with a BPN, and (4) a BPN with the associative memory as initial weights. In case (3), the hierarchical network consists of two levels of neural networks. At the lower level, a DAM receives the output vectors of the image-preprocessing functions to create a system that recognizes patterns regardless of changes in scale or rotation. The higher level is a two-layer BPN that receives the recalled information from the memorized database of the lower level. Placing a BPN after the DAM raises the recognition ratio compared with a DAM alone and is faster than a BPN alone. In case (4), training of the BPN is greatly accelerated because the network uses the associative memory of a DAM as the initial weights of the first layer of the BPN. Experimental results show that the system recognizes all patterns correctly when the percentage of white noise is under 20% for cases (3) and (4).
|
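The two hybrid schemes in the abstract — case (3), where a DAM's noise-tolerant recall feeds a BPN, and case (4), where the DAM matrix seeds the BPN's first-layer weights — can be sketched as follows. This is a minimal NumPy illustration under assumptions, not the thesis' implementation: the pseudoinverse storage rule, the layer sizes, and the toy feature vectors are all stand-ins for the invariant features the preprocessing stage would actually produce.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for the invariant feature vectors from preprocessing
# (assumed shapes; the thesis' real features differ).
n_features, n_classes = 16, 5
X = rng.standard_normal((n_features, n_classes))  # key vectors as columns
Y = np.eye(n_classes)                             # one-hot recall targets

# Distributed associative memory: store all pairs in one matrix via the
# pseudoinverse rule M = Y X^+, so recall is simply y_hat = M @ x.
M = Y @ np.linalg.pinv(X)

# Recall from a noisy key still points to the stored class — this is
# case (3)'s lower level; its recalled vector would then feed the BPN.
x_noisy = X[:, 2] + 0.1 * rng.standard_normal(n_features)
y_hat = M @ x_noisy
print(int(np.argmax(y_hat)))  # -> 2

# Case (4): use the DAM matrix as the initial first-layer weights of the
# BPN instead of random initialization, then fine-tune by backpropagation.
W1 = M.copy()                                          # DAM-seeded layer 1
W2 = 0.1 * rng.standard_normal((n_classes, n_classes))  # random layer 2
h = np.tanh(W1 @ x_noisy)   # forward pass through the seeded layer
out = W2 @ h                # output layer, still to be trained
```

Because the keys are linearly independent, the pseudoinverse rule makes recall exact on clean keys (M @ X equals Y), which is why small perturbations still land on the correct class.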
author2 |
Pau-Choo Chung |
author_facet |
Pau-Choo Chung Jau-Ling Shih 石昭玲 |
author |
Jau-Ling Shih 石昭玲 |
spellingShingle |
Jau-Ling Shih 石昭玲 2-D Invariant Pattern Recognition Using a Backpropogation Network Improved by Distributed Associative Memory |
author_sort |
Jau-Ling Shih |
title |
2-D Invariant Pattern Recognition Using a Backpropogation Network Improved by Distributed Associative Memory |
title_short |
2-D Invariant Pattern Recognition Using a Backpropogation Network Improved by Distributed Associative Memory |
title_full |
2-D Invariant Pattern Recognition Using a Backpropogation Network Improved by Distributed Associative Memory |
title_fullStr |
2-D Invariant Pattern Recognition Using a Backpropogation Network Improved by Distributed Associative Memory |
title_full_unstemmed |
2-D Invariant Pattern Recognition Using a Backpropogation Network Improved by Distributed Associative Memory |
title_sort |
2-d invariant pattern recognition using a backpropogation network improved by distributed associative memory |
publishDate |
1994 |
url |
http://ndltd.ncl.edu.tw/handle/47410554866730966852 |
work_keys_str_mv |
AT jaulingshih 2dinvariantpatternrecognitionusingabackpropogationnetworkimprovedbydistributedassociativememory AT shízhāolíng 2dinvariantpatternrecognitionusingabackpropogationnetworkimprovedbydistributedassociativememory AT jaulingshih jiāqiángshìfǎnxiàngchuánbōwǎnglùzhīèrwéitúxíngbiànshíxìtǒng AT shízhāolíng jiāqiángshìfǎnxiàngchuánbōwǎnglùzhīèrwéitúxíngbiànshíxìtǒng |
_version_ |
1717767126152904704 |