Fast Linear Adaptive Skipping Training Algorithm for Training Artificial Neural Network
Artificial neural networks have been extensively used as training models for solving pattern recognition tasks. However, training on a very large data set with a complex neural network requires excessively long training time. In this correspondence, a new fast Linear Adaptive Skipping Training...
Main Authors: | R. Manjula Devi, S. Kuppuswami, R. C. Suganthe |
---|---|
Format: | Article |
Language: | English |
Published: | Hindawi Limited, 2013-01-01 |
Series: | Mathematical Problems in Engineering |
Online Access: | http://dx.doi.org/10.1155/2013/346949 |
id |
doaj-c6a91e687d5a40b39b042e7be6afb8e0 |
record_format |
Article |
spelling |
doaj-c6a91e687d5a40b39b042e7be6afb8e0; 2020-11-24T23:03:40Z; eng; Hindawi Limited; Mathematical Problems in Engineering; ISSN 1024-123X, 1563-5147; 2013-01-01; 2013; 10.1155/2013/346949; 346949. Fast Linear Adaptive Skipping Training Algorithm for Training Artificial Neural Network. R. Manjula Devi, S. Kuppuswami, R. C. Suganthe (all: Department of Computer Science and Engineering, Kongu Engineering College, Erode 638 052, India). Artificial neural networks have been extensively used as training models for solving pattern recognition tasks. However, training on a very large data set with a complex neural network requires excessively long training time. In this correspondence, a new fast Linear Adaptive Skipping Training (LAST) algorithm for training artificial neural networks (ANNs) is introduced. The core idea of this paper is to improve the training speed of an ANN by presenting, at each epoch, only the input samples that were not classified correctly in the previous epoch, thereby dynamically reducing the number of input samples shown to the network at every epoch without affecting the network's accuracy. Decreasing the size of the training set in this way reduces the training time and thus improves the training speed. The LAST algorithm also determines how many epochs a particular input sample should skip, depending on the successful classification of that sample. The LAST algorithm can be incorporated into any supervised training algorithm. Experimental results show that the training speed attained by the LAST algorithm is considerably higher than that of conventional training algorithms. http://dx.doi.org/10.1155/2013/346949 |
collection |
DOAJ |
language |
English |
format |
Article |
sources |
DOAJ |
author |
R. Manjula Devi, S. Kuppuswami, R. C. Suganthe |
spellingShingle |
R. Manjula Devi, S. Kuppuswami, R. C. Suganthe; Fast Linear Adaptive Skipping Training Algorithm for Training Artificial Neural Network; Mathematical Problems in Engineering |
author_facet |
R. Manjula Devi, S. Kuppuswami, R. C. Suganthe |
author_sort |
R. Manjula Devi |
title |
Fast Linear Adaptive Skipping Training Algorithm for Training Artificial Neural Network |
title_short |
Fast Linear Adaptive Skipping Training Algorithm for Training Artificial Neural Network |
title_full |
Fast Linear Adaptive Skipping Training Algorithm for Training Artificial Neural Network |
title_fullStr |
Fast Linear Adaptive Skipping Training Algorithm for Training Artificial Neural Network |
title_full_unstemmed |
Fast Linear Adaptive Skipping Training Algorithm for Training Artificial Neural Network |
title_sort |
fast linear adaptive skipping training algorithm for training artificial neural network |
publisher |
Hindawi Limited |
series |
Mathematical Problems in Engineering |
issn |
1024-123X, 1563-5147 |
publishDate |
2013-01-01 |
description |
Artificial neural networks have been extensively used as training models for solving pattern recognition tasks. However, training on a very large data set with a complex neural network requires excessively long training time. In this correspondence, a new fast Linear Adaptive Skipping Training (LAST) algorithm for training artificial neural networks (ANNs) is introduced. The core idea of this paper is to improve the training speed of an ANN by presenting, at each epoch, only the input samples that were not classified correctly in the previous epoch, thereby dynamically reducing the number of input samples shown to the network at every epoch without affecting the network's accuracy. Decreasing the size of the training set in this way reduces the training time and thus improves the training speed. The LAST algorithm also determines how many epochs a particular input sample should skip, depending on the successful classification of that sample. The LAST algorithm can be incorporated into any supervised training algorithm. Experimental results show that the training speed attained by the LAST algorithm is considerably higher than that of conventional training algorithms. |
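The description above outlines an adaptive skipping scheme: a sample classified correctly is withheld for some number of subsequent epochs, with the skip interval growing as the sample keeps being classified correctly. A minimal Python sketch of that idea follows; the function name, the linear skip schedule (skip as many epochs as the current streak of correct classifications), and the `predict`/`update` callbacks are all illustrative assumptions, not the authors' published code.

```python
# Hypothetical sketch of a linear adaptive skipping training loop.
# Samples classified correctly in the previous presentation are skipped
# for a number of epochs that grows linearly with their streak of
# consecutive correct classifications; misclassified samples reset.

def last_train(samples, labels, predict, update, epochs=10):
    """Train with linear adaptive skipping.

    predict(x) -> predicted label; update(x, y) -> adjusts the model
    on a misclassified sample. Returns the total number of sample
    presentations actually made across all epochs.
    """
    n = len(samples)
    skip_until = [0] * n   # epoch before which sample i is skipped
    streak = [0] * n       # consecutive correct classifications of sample i
    presentations = 0
    for epoch in range(epochs):
        for i in range(n):
            if epoch < skip_until[i]:
                continue               # sample still in its skipping window
            presentations += 1
            if predict(samples[i]) == labels[i]:
                streak[i] += 1
                # linear schedule: skip the next `streak[i]` epochs
                skip_until[i] = epoch + 1 + streak[i]
            else:
                streak[i] = 0          # misclassified: reset and retrain
                update(samples[i], labels[i])
    return presentations
```

With this schedule, a sample that is always classified correctly is presented at epochs 0, 2, 5, 9, ... (gaps of 1, 2, 3, ...), so presentation counts shrink roughly quadratically for easy samples, while hard samples are shown every epoch. The wrapper makes no assumption about the underlying supervised learner, matching the claim that the scheme can sit on top of any supervised training algorithm.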
url |
http://dx.doi.org/10.1155/2013/346949 |
work_keys_str_mv |
AT rmanjuladevi fastlinearadaptiveskippingtrainingalgorithmfortrainingartificialneuralnetwork AT skuppuswami fastlinearadaptiveskippingtrainingalgorithmfortrainingartificialneuralnetwork AT rcsuganthe fastlinearadaptiveskippingtrainingalgorithmfortrainingartificialneuralnetwork |
_version_ |
1725632774864371712 |