Local Sigmoid Method: Non-Iterative Deterministic Learning Algorithm for Automatic Model Construction of Neural Network

Non-iterative learning algorithms for artificial neural networks offer an alternative way to optimize network parameters with extremely fast convergence. The extreme learning machine (ELM) is one of the fastest non-iterative learning algorithms for the single hidden layer feedforward neural network (SLFN) model. However, ELM relies on a randomization technique that requires a large number of hidden nodes to achieve high accuracy, which leads to a large, complex model that is slow at inference time. Previously, we reported the analytical incremental learning (AIL) algorithm, a non-iterative deterministic learning algorithm that yields a compact model, as an alternative. However, AIL cannot grow its set of hidden nodes because of the node saturation problem. Here, we describe the local sigmoid method (LSM), a non-iterative deterministic learning algorithm that also yields a sufficiently compact model and overcomes both the ELM randomization and AIL node saturation problems. LSM is based on a divide-and-conquer approach that splits the dataset into several subsets that are easier to optimize separately. Each subset can be associated with a local segment represented as a hidden node that preserves the local information of that subset. This technique helps us understand the function of each hidden node in the constructed network, and it can also be used to explain the function of hidden nodes learned by backpropagation, an iterative algorithm. Based on our experimental results, LSM is more accurate than other non-iterative learning algorithms and produces one of the most compact models.
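The abstract contrasts ELM's randomized hidden nodes with hidden nodes derived deterministically from local segments of the data. The following Python sketch is a rough illustration only, not the authors' LSM implementation: it shows (1) a generic ELM-style SLFN trained non-iteratively with random hidden parameters and least-squares output weights, and (2) a hypothetical 1-D "local segment" construction in the spirit of the divide-and-conquer idea, where each contiguous subset contributes one sigmoid hidden node placed using the subset's local slope. All function names, the 4 x slope weight mapping, and the restriction to one input dimension are assumptions made for illustration.

# A rough, illustrative sketch only (not the authors' code): it contrasts an
# ELM-style SLFN, whose hidden nodes are random, with a hypothetical 1-D
# "local segment" construction in the spirit of the abstract, where each
# contiguous subset of the data contributes one deterministically placed
# sigmoid hidden node. The 4*slope weight mapping and all names are assumptions.
import numpy as np


def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))


def train_elm(X, y, n_hidden, seed=0):
    """ELM-style non-iterative training: random hidden parameters,
    output weights solved in closed form by least squares."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights
    b = rng.normal(size=n_hidden)                 # random biases
    H = sigmoid(X @ W + b)                        # hidden-layer output matrix
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # output weights, non-iterative
    return W, b, beta


def train_local_sigmoid_1d(x, y, n_segments):
    """Hypothetical divide-and-conquer construction for 1-D regression:
    split the sorted data into contiguous subsets, centre one sigmoid hidden
    node on each subset, derive its input weight from the subset's local
    slope, then solve the output weights by least squares over all data."""
    order = np.argsort(x)
    xs, ys = x[order], y[order]
    segments = np.array_split(np.arange(len(xs)), n_segments)
    w, b = [], []
    for idx in segments:
        slope, _ = np.polyfit(xs[idx], ys[idx], 1)  # local linear fit ("slope information")
        centre = xs[idx].mean()
        wi = 4.0 * slope               # sigmoid'(0) = 1/4, so the node's slope at its centre ~ slope
        w.append(wi)
        b.append(-wi * centre)         # place the node's transition at the subset's centre
    w, b = np.array(w), np.array(b)
    H = sigmoid(np.outer(x, w) + b)    # hidden-layer outputs for every sample and node
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)
    return w, b, beta


def predict_local_1d(x, w, b, beta):
    return sigmoid(np.outer(x, w) + b) @ beta


if __name__ == "__main__":
    # Toy 1-D regression problem to exercise both non-iterative constructions.
    rng = np.random.default_rng(0)
    x = np.linspace(-3.0, 3.0, 200)
    y = np.sin(x) + 0.05 * rng.normal(size=x.shape)

    w, b, beta = train_local_sigmoid_1d(x, y, n_segments=6)
    print("local-segment fit MSE:", np.mean((predict_local_1d(x, w, b, beta) - y) ** 2))

    We, be, betae = train_elm(x[:, None], y, n_hidden=50)
    y_elm = sigmoid(x[:, None] @ We + be) @ betae
    print("ELM-style fit MSE:   ", np.mean((y_elm - y) ** 2))

Under these assumptions, both models are fitted without any iterative weight updates; the difference is only in how the hidden nodes are chosen (random versus derived from local segments).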

Bibliographic Details
Main Authors: Syukron Abu Ishaq Alfarozi, Kitsuchart Pasupa, Masanori Sugimoto, Kuntpong Woraratpanya
Format: Article
Language: English
Published: IEEE, 2020-01-01
Series: IEEE Access
Subjects: Neural network; compact model; hidden node interpretation; sigmoid function; function approximation; slope information
Online Access: https://ieeexplore.ieee.org/document/8967052/
id doaj-cb625e2b2d21440793a463a8dd70cc17
record_format Article
IEEE Access, vol. 8, pp. 20342-20362, 2020-01-01. ISSN 2169-3536. DOI: 10.1109/ACCESS.2020.2968983. IEEE Xplore document 8967052.
Authors and affiliations:
Syukron Abu Ishaq Alfarozi (https://orcid.org/0000-0001-8558-898X), Faculty of Information Technology, King Mongkut’s Institute of Technology Ladkrabang, Bangkok, Thailand
Kitsuchart Pasupa (https://orcid.org/0000-0001-8359-9888), Faculty of Information Technology, King Mongkut’s Institute of Technology Ladkrabang, Bangkok, Thailand
Masanori Sugimoto, Graduate School of Information Science and Technology, Hokkaido University, Sapporo, Japan
Kuntpong Woraratpanya (https://orcid.org/0000-0002-8337-4563), Faculty of Information Technology, King Mongkut’s Institute of Technology Ladkrabang, Bangkok, Thailand
collection DOAJ
language English
format Article
sources DOAJ
author Syukron Abu Ishaq Alfarozi
Kitsuchart Pasupa
Masanori Sugimoto
Kuntpong Woraratpanya
title Local Sigmoid Method: Non-Iterative Deterministic Learning Algorithm for Automatic Model Construction of Neural Network
publisher IEEE
series IEEE Access
issn 2169-3536
publishDate 2020-01-01
description Non-iterative learning algorithms for artificial neural networks offer an alternative way to optimize network parameters with extremely fast convergence. The extreme learning machine (ELM) is one of the fastest non-iterative learning algorithms for the single hidden layer feedforward neural network (SLFN) model. However, ELM relies on a randomization technique that requires a large number of hidden nodes to achieve high accuracy, which leads to a large, complex model that is slow at inference time. Previously, we reported the analytical incremental learning (AIL) algorithm, a non-iterative deterministic learning algorithm that yields a compact model, as an alternative. However, AIL cannot grow its set of hidden nodes because of the node saturation problem. Here, we describe the local sigmoid method (LSM), a non-iterative deterministic learning algorithm that also yields a sufficiently compact model and overcomes both the ELM randomization and AIL node saturation problems. LSM is based on a divide-and-conquer approach that splits the dataset into several subsets that are easier to optimize separately. Each subset can be associated with a local segment represented as a hidden node that preserves the local information of that subset. This technique helps us understand the function of each hidden node in the constructed network, and it can also be used to explain the function of hidden nodes learned by backpropagation, an iterative algorithm. Based on our experimental results, LSM is more accurate than other non-iterative learning algorithms and produces one of the most compact models.
topic Neural network
compact model
hidden node interpretation
sigmoid function
function approximation
slope information
url https://ieeexplore.ieee.org/document/8967052/