Divide-and-Conquer Learning and Modular Perceptron Network
PhD === National Chiao Tung University === Department of Computer Science and Information Engineering === 90 === A novel Modular Perceptron Network (MPN) and Divide-and-Conquer Learning (DCL) scheme with weight estimation for the design of modular neural networks is proposed. Finally, dimensionality reduction is performed on the perceptron networks, with clear effectiveness...
Main Authors: | Yen-Po Lee 李衍博 |
---|---|
Other Authors: | Prof. Hsin-Chia Fu 傅心家 |
Format: | Others |
Language: | zh-TW |
Published: | 2002 |
Online Access: | http://ndltd.ncl.edu.tw/handle/59652302687861525600 |
id |
ndltd-TW-090NCTU0392008 |
record_format |
oai_dc |
spelling |
ndltd-TW-090NCTU03920082016-06-27T16:08:59Z http://ndltd.ncl.edu.tw/handle/59652302687861525600 Divide-and-Conquer Learning and Modular Perceptron Network 分割且克服之學習與模組化感知器神經網路之研究 Yen-Po Lee 李衍博 PhD National Chiao Tung University Department of Computer Science and Information Engineering 90 A novel Modular Perceptron Network (MPN) and Divide-and-Conquer Learning (DCL) scheme with weight estimation for the design of modular neural networks is proposed. Finally, dimensionality reduction is performed on the perceptron networks, with clear effectiveness against the curse of dimensionality. When the training process of a multilayer perceptron falls into a local minimum or stalls in a flat region, the proposed DCL scheme divides the current training data region (i.e., a training set that is hard to learn) into two regions that are, hopefully, easier to learn. The learning process continues by constructing a self-growing perceptron network, with estimated initial weights, for one of the newly partitioned regions; the other partitioned region resumes training on the original perceptron network. Data-region partitioning, weight estimation, and learning are repeated iteratively until all the training data are completely learned by the MPN. We have evaluated and compared the proposed MPN with several representative neural networks on the Two-spirals problem and on real-world databases. On the Two-spirals problem, the MPN achieves better weight-learning performance, requiring far fewer data presentations (87.86%~99.01% fewer) during the training phase, better generalization performance (4.0% better), and less processing time (2.0%~81.3% less) during the retrieving phase. On the real-world data, the MPNs show the same performance trends as above and less overfitting than a single MLP. Moreover, to address the curse of dimensionality induced by high-dimensional input data during learning, an evaluation algorithm is established in this thesis. Experimental results show that the dimension of the input data can be greatly reduced (by 75%~88%) and the number of data presentations also reduced (by 70%~90%), so a small-sized MPN can be obtained while maintaining learning and testing performance at the same good level as before. In addition, owing to its self-growing and fast local-learning characteristics, the MPN can easily adapt to on-line and/or incremental learning requirements in a rapidly changing environment. Prof. Hsin-Chia Fu 傅心家 2002 學位論文 ; thesis 108 zh-TW |
collection |
NDLTD |
language |
zh-TW |
format |
Others |
sources |
NDLTD |
description |
PhD === National Chiao Tung University === Department of Computer Science and Information Engineering === 90 === A novel Modular Perceptron Network (MPN) and Divide-and-Conquer Learning (DCL) scheme with weight estimation for the design of modular neural networks is proposed. Finally, dimensionality reduction is performed on the perceptron networks, with clear effectiveness against the curse of dimensionality.
When the training process of a multilayer perceptron falls into a local minimum or stalls in a flat region, the proposed DCL scheme divides the current training data region (i.e., a training set that is hard to learn) into two regions that are, hopefully, easier to learn. The learning process continues by constructing a self-growing perceptron network, with estimated initial weights, for one of the newly partitioned regions; the other partitioned region resumes training on the original perceptron network. Data-region partitioning, weight estimation, and learning are repeated iteratively until all the training data are completely learned by the MPN. We have evaluated and compared the proposed MPN with several representative neural networks on the Two-spirals problem and on real-world databases.
On the Two-spirals problem, the MPN achieves better weight-learning performance, requiring far fewer data presentations (87.86%~99.01% fewer) during the training phase, better generalization performance (4.0% better), and less processing time (2.0%~81.3% less) during the retrieving phase. On the real-world data, the MPNs show the same performance trends as above and less overfitting than a single MLP. Moreover, to address the curse of dimensionality induced by high-dimensional input data during learning, an evaluation algorithm is established in this thesis. Experimental results show that the dimension of the input data can be greatly reduced (by 75%~88%) and the number of data presentations also reduced (by 70%~90%), so a small-sized MPN can be obtained while maintaining learning and testing performance at the same good level as before. In addition, owing to its self-growing and fast local-learning characteristics, the MPN can easily adapt to on-line and/or incremental learning requirements in a rapidly changing environment.
|
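For illustration only, the iterative partition-and-train loop the abstract describes might be sketched as follows. The stall detection, the region-splitting rule (a median split on one feature), and the toy single-layer trainer below are hypothetical placeholders; they are not the thesis's actual DCL partitioning or weight-estimation method.

```python
# Hypothetical sketch of a divide-and-conquer training loop: train a module,
# and if training stalls before all samples are learned, split the data
# region in two and train each part separately.
import numpy as np

def train_until_stall(X, y, max_epochs=100):
    """Toy perceptron trainer; returns (weights, per-sample learned mask)."""
    w = np.zeros(X.shape[1])
    for _ in range(max_epochs):
        preds = (X @ w > 0).astype(int)
        errs = y - preds                 # values in {-1, 0, 1}
        if not errs.any():               # every sample classified correctly
            break
        w += 0.1 * (errs @ X)            # perceptron-style weight update
    learned = ((X @ w > 0).astype(int) == y)
    return w, learned

def dcl_train(X, y):
    """Partition hard regions and train one module per region."""
    modules = []                         # the resulting modular network
    regions = [(X, y)]
    while regions:
        Xr, yr = regions.pop()
        w, learned = train_until_stall(Xr, yr)
        if learned.all() or len(Xr) <= 1:
            modules.append(w)            # region learned (or unsplittable)
        else:
            # Stalled: split the region in two (median split on the first
            # feature, a stand-in for the thesis's partitioning scheme).
            left = Xr[:, 0] <= np.median(Xr[:, 0])
            if left.all() or not left.any():       # degenerate split guard
                left = np.arange(len(Xr)) < len(Xr) // 2
            regions.append((Xr[left], yr[left]))
            regions.append((Xr[~left], yr[~left]))
    return modules
```

On a linearly separable region the loop terminates with a single module; on a harder region (such as the Two-spirals data) it keeps splitting until every sub-region is learned, which mirrors the self-growing behaviour described above.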
author2 |
Prof. Hsin-Chia Fu |
author_facet |
Prof. Hsin-Chia Fu Yen-Po Lee 李衍博 |
author |
Yen-Po Lee 李衍博 |
spellingShingle |
Yen-Po Lee 李衍博 Divide-and-Conquer Learning and Modular Perceptron Network |
author_sort |
Yen-Po Lee |
title |
Divide-and-Conquer Learning and Modular Perceptron Network |
title_short |
Divide-and-Conquer Learning and Modular Perceptron Network |
title_full |
Divide-and-Conquer Learning and Modular Perceptron Network |
title_fullStr |
Divide-and-Conquer Learning and Modular Perceptron Network |
title_full_unstemmed |
Divide-and-Conquer Learning and Modular Perceptron Network |
title_sort |
divide-and-conquer learning and modular perceptron network |
publishDate |
2002 |
url |
http://ndltd.ncl.edu.tw/handle/59652302687861525600 |
work_keys_str_mv |
AT yenpolee divideandconquerlearningandmodularperceptronnetwork AT lǐyǎnbó divideandconquerlearningandmodularperceptronnetwork AT yenpolee fēngēqiěkèfúzhīxuéxíyǔmózǔhuàgǎnzhīqìshénjīngwǎnglùzhīyánjiū AT lǐyǎnbó fēngēqiěkèfúzhīxuéxíyǔmózǔhuàgǎnzhīqìshénjīngwǎnglùzhīyánjiū |
_version_ |
1718324429081018368 |