Expected Quantization Error Stability-Based Self-Organizing Growth Neural Network With Adaptive Output Network Scale

The self-organizing growth neural network (SOGNN) is an unsupervised clustering algorithm based on competitive learning that can extract the distribution and topology information of input data. However, current SOGNNs lack an effective and stable criterion for judging the scale of network node growth, and therefore cannot reliably evaluate and control the size of the output space. To this end, this paper proposes a judgment index, expected quantization error stability (EQES), to objectively assess how closely the output space approximates the input space. Building on the Growing Neural Gas (GNG) algorithm, an improved algorithm called GNG-EQES is proposed, which introduces the EQES criterion so that GNG can autonomously generate an appropriate number of output network nodes without the size of the output network being fixed in advance. This not only improves the accuracy of feature extraction of the SOGNN but also improves its adaptability and broadens its range of application. Experiments in continuous and discrete input spaces verify the validity and feasibility of the proposed method, which is also applied to the construction of a mobile robot environment topology map.
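
To make the growth mechanism described above concrete, the following is a minimal sketch of a GNG training loop in which node insertion stops once the expected quantization error has stabilized, one plausible reading of the EQES criterion. The standard GNG hyperparameters (lam, eps_b, eps_n, alpha, d, age_max) follow Fritzke's original algorithm; the stability test itself (window eqes_window, relative tolerance eqes_tol), the function name gng_eqes_sketch, and all default values are illustrative assumptions, not the authors' implementation.

```python
# Minimal Growing Neural Gas sketch with an EQES-style stopping rule.
# The stability check is an assumed stand-in for the paper's EQES index.
import numpy as np


def gng_eqes_sketch(data, lam=100, eps_b=0.05, eps_n=0.005, alpha=0.5,
                    d=0.995, age_max=50, eqes_window=5, eqes_tol=0.01,
                    max_nodes=500, seed=0):
    """Grow a GNG on `data` (N x dim); stop inserting nodes when the mean
    quantization error changes by less than eqes_tol (relative) over the
    last eqes_window insertions, or when max_nodes is reached."""
    rng = np.random.default_rng(seed)
    data = np.asarray(data, dtype=float)

    nodes = [data[rng.integers(len(data))].copy() for _ in range(2)]
    errors = [0.0, 0.0]          # accumulated quantization error per node
    edges = {}                   # (i, j) with i < j  ->  edge age
    eq_history = []              # mean quantization error at each insertion

    def neighbors(i):
        return [b if a == i else a for (a, b) in edges if i in (a, b)]

    step = 0
    while len(nodes) < max_nodes:
        step += 1
        x = data[rng.integers(len(data))]
        dist = np.linalg.norm(np.asarray(nodes) - x, axis=1)
        order = np.argsort(dist)
        s1, s2 = int(order[0]), int(order[1])      # winner and runner-up

        # Accumulate error, adapt winner and neighbors, age winner's edges.
        errors[s1] += float(dist[s1]) ** 2
        nodes[s1] += eps_b * (x - nodes[s1])
        for n in neighbors(s1):
            nodes[n] += eps_n * (x - nodes[n])
            edges[(min(s1, n), max(s1, n))] += 1
        edges[(min(s1, s2), max(s1, s2))] = 0      # create/refresh edge

        # Prune edges that are too old, then drop nodes left without edges.
        edges = {e: age for e, age in edges.items() if age <= age_max}
        keep = sorted({i for e in edges for i in e})
        if len(keep) < len(nodes):
            remap = {old: new for new, old in enumerate(keep)}
            nodes = [nodes[i] for i in keep]
            errors = [errors[i] for i in keep]
            edges = {(remap[a], remap[b]): age for (a, b), age in edges.items()}

        # Periodically insert a node between the worst node and its worst neighbor.
        if step % lam == 0:
            q = int(np.argmax(errors))
            f = max(neighbors(q), key=lambda i: errors[i])
            nodes.append(0.5 * (nodes[q] + nodes[f]))
            errors[q] *= alpha
            errors[f] *= alpha
            errors.append(errors[q])
            r = len(nodes) - 1
            edges.pop((min(q, f), max(q, f)), None)
            edges[(min(q, r), max(q, r))] = 0
            edges[(min(f, r), max(f, r))] = 0

            # EQES-style check (assumed form): stop growing once the mean
            # quantization error is stable across recent insertions.
            W = np.asarray(nodes)
            eq = float(np.mean(np.min(
                np.linalg.norm(data[:, None, :] - W[None, :, :], axis=2), axis=1)))
            eq_history.append(eq)
            if len(eq_history) >= eqes_window:
                recent = eq_history[-eqes_window:]
                if max(recent) - min(recent) <= eqes_tol * max(recent):
                    break

        errors = [e * d for e in errors]           # global error decay

    return np.asarray(nodes), edges


if __name__ == "__main__":
    # Continuous 2-D input space: noisy samples on a unit circle.
    rng = np.random.default_rng(1)
    pts = rng.normal(size=(3000, 2))
    pts = pts / np.linalg.norm(pts, axis=1, keepdims=True)
    pts += 0.05 * rng.normal(size=pts.shape)
    W, E = gng_eqes_sketch(pts)
    print(f"network stabilized at {len(W)} nodes and {len(E)} edges")
```

Stopping on error stability rather than at a preset node count is what lets the output network scale adapt to the input distribution, which is the behaviour the abstract attributes to GNG-EQES; the exact stability index used in the paper should be taken from the article itself.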

Bibliographic Details
Main Authors: Chaoliang Zhong, Yao Zhou
Format: Article
Language: English
Published: IEEE, 2019-01-01
Series: IEEE Access
Subjects: Growing neural gas; self-organization neural network; unsupervised learning; competitive learning
Online Access: https://ieeexplore.ieee.org/document/8836505/
id doaj-3ad92bdb145048dbbecdc7f329670de1
record_format Article
spelling doaj-3ad92bdb145048dbbecdc7f329670de1
  2021-04-05T17:30:16Z
  eng
  IEEE
  IEEE Access, ISSN 2169-3536, 2019-01-01, Vol. 7, pp. 134564-134573
  DOI 10.1109/ACCESS.2019.2941211, IEEE document 8836505
  Expected Quantization Error Stability-Based Self-Organizing Growth Neural Network With Adaptive Output Network Scale
  Chaoliang Zhong (https://orcid.org/0000-0001-8981-1982), School of Automation, Hangzhou Dianzi University, Hangzhou, China
  Yao Zhou, School of Automation, Hangzhou Dianzi University, Hangzhou, China
  The self-organizing growth neural network (SOGNN) is an unsupervised clustering algorithm based on competitive learning that can extract the distribution and topology information of input data. However, current SOGNNs lack an effective and stable criterion for judging the scale of network node growth, and therefore cannot reliably evaluate and control the size of the output space. To this end, this paper proposes a judgment index, expected quantization error stability (EQES), to objectively assess how closely the output space approximates the input space. Building on the Growing Neural Gas (GNG) algorithm, an improved algorithm called GNG-EQES is proposed, which introduces the EQES criterion so that GNG can autonomously generate an appropriate number of output network nodes without the size of the output network being fixed in advance. This not only improves the accuracy of feature extraction of the SOGNN but also improves its adaptability and broadens its range of application. Experiments in continuous and discrete input spaces verify the validity and feasibility of the proposed method, which is also applied to the construction of a mobile robot environment topology map.
  https://ieeexplore.ieee.org/document/8836505/
  Growing neural gas; self-organization neural network; unsupervised learning; competitive learning
collection DOAJ
language English
format Article
sources DOAJ
author Chaoliang Zhong
Yao Zhou
spellingShingle Chaoliang Zhong
Yao Zhou
Expected Quantization Error Stability-Based Self-Organizing Growth Neural Network With Adaptive Output Network Scale
IEEE Access
Growing neural gas
self-organization neural network
unsupervised learning
competitive learning
author_facet Chaoliang Zhong
Yao Zhou
author_sort Chaoliang Zhong
title Expected Quantization Error Stability-Based Self-Organizing Growth Neural Network With Adaptive Output Network Scale
title_short Expected Quantization Error Stability-Based Self-Organizing Growth Neural Network With Adaptive Output Network Scale
title_full Expected Quantization Error Stability-Based Self-Organizing Growth Neural Network With Adaptive Output Network Scale
title_fullStr Expected Quantization Error Stability-Based Self-Organizing Growth Neural Network With Adaptive Output Network Scale
title_full_unstemmed Expected Quantization Error Stability-Based Self-Organizing Growth Neural Network With Adaptive Output Network Scale
title_sort expected quantization error stability-based self-organizing growth neural network with adaptive output network scale
publisher IEEE
series IEEE Access
issn 2169-3536
publishDate 2019-01-01
description The self-organizing growth neural network (SOGNN) is an unsupervised clustering algorithm based on competitive learning that can extract the distribution and topology information of input data. However, current SOGNNs lack an effective and stable criterion for judging the scale of network node growth, and therefore cannot reliably evaluate and control the size of the output space. To this end, this paper proposes a judgment index, expected quantization error stability (EQES), to objectively assess how closely the output space approximates the input space. Building on the Growing Neural Gas (GNG) algorithm, an improved algorithm called GNG-EQES is proposed, which introduces the EQES criterion so that GNG can autonomously generate an appropriate number of output network nodes without the size of the output network being fixed in advance. This not only improves the accuracy of feature extraction of the SOGNN but also improves its adaptability and broadens its range of application. Experiments in continuous and discrete input spaces verify the validity and feasibility of the proposed method, which is also applied to the construction of a mobile robot environment topology map.
topic Growing neural gas
self-organization neural network
unsupervised learning
competitive learning
url https://ieeexplore.ieee.org/document/8836505/
work_keys_str_mv AT chaoliangzhong expectedquantizationerrorstabilitybasedselforganizinggrowthneuralnetworkwithadaptiveoutputnetworkscale
AT yaozhou expectedquantizationerrorstabilitybasedselforganizinggrowthneuralnetworkwithadaptiveoutputnetworkscale
_version_ 1721539398798934016