Improved Radial Basis Function Neural Network for the Outliers and Heteroscedasticity/Skewness Noises and Its Application

Bibliographic Details
Main Author: Yue-Shiang Liu (劉岳翔)
Other Authors: Shun-Feng Su
Format: Others
Language: en_US
Published: 2010
Online Access: http://ndltd.ncl.edu.tw/handle/05302393733029979823
Description
Summary: Master's thesis === National Taiwan University of Science and Technology === Department of Electrical Engineering === 98 === In this thesis, we propose the least trimmed squares-support vector regression radial basis function network (LTS-SVR RBFN) and the transformation-based LTS-SVR RBFN. The aim of the LTS-SVR RBFN is to reduce the influence of outliers and large noise on the modeling problem, because the traditional learning algorithm is not robust. The LTS-SVR RBFN adopts a two-stage strategy. In the first stage, the LTS-SVR procedure resolves the initialization problems of both the RBFN and the LTS, and a new subsample that no longer contains the outliers and large noise is retained. Hence, the gradient-descent learning algorithm can be applied directly to adjust the parameters of the RBFN after the first stage. The purpose of the transformation-based LTS-SVR RBFN is to reduce the influence of heteroscedastic/skewed noise in the training data set. Most learning algorithms are treated as statistical nonlinear regression models that assume a constant noise level; however, heteroscedastic/skewed noise frequently appears in real-world data. The transformation-based LTS-SVR RBFN therefore employs the Box-Cox transformation together with the LTS-SVR procedure to address this problem. Finally, the LTS-SVR RBFN and the transformation-based LTS-SVR RBFN are used to model various functions and systems contaminated with outliers and heteroscedastic/skewed noise, respectively. The experimental results show that these approaches not only resolve the initialization problems of the RBFN and the LTS but are also robust for data sets with outliers and heteroscedastic/skewed noise.
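As a rough illustration of the two-stage idea described in the abstract, the Python sketch below first fits a support vector regressor and trims the samples with the largest residuals (the least-trimmed-squares step), then trains a small Gaussian RBF network on the retained subsample by gradient descent. It is a minimal sketch under stated assumptions, not the thesis implementation: scikit-learn's SVR stands in for the LTS-SVR procedure, the trimming fraction and network sizes are illustrative choices, and only the output weights are adjusted by gradient descent, whereas the thesis also tunes the RBFN centers and widths.

import numpy as np
from sklearn.svm import SVR

def stage1_lts_svr(X, y, trim_fraction=0.2):
    # Stage 1 (sketch): fit an epsilon-SVR, rank samples by absolute residual,
    # and keep the (1 - trim_fraction) fraction with the smallest residuals,
    # in the spirit of least trimmed squares.
    svr = SVR(kernel="rbf", C=10.0, epsilon=0.1).fit(X, y)
    residuals = np.abs(y - svr.predict(X))
    n_keep = int(np.ceil((1.0 - trim_fraction) * len(y)))
    keep = np.argsort(residuals)[:n_keep]
    return X[keep], y[keep]

def stage2_rbfn(X, y, n_centers=10, lr=0.1, epochs=2000, seed=0):
    # Stage 2 (sketch): a Gaussian RBF network whose output weights are fitted
    # by gradient descent on the trimmed subsample. Centers and the shared
    # width are fixed heuristically here, which is a simplification.
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), n_centers, replace=False)]
    width = np.mean(np.linalg.norm(X[:, None] - centers[None], axis=-1)) + 1e-8

    def phi(Xq):
        d = np.linalg.norm(Xq[:, None] - centers[None], axis=-1)
        return np.exp(-(d / width) ** 2)

    H = phi(X)
    w = np.zeros(n_centers)
    for _ in range(epochs):
        grad = H.T @ (H @ w - y) / len(y)   # gradient of the mean squared error
        w -= lr * grad
    return lambda Xq: phi(Xq) @ w

# Toy usage (assumed example): a sine function with a few gross outliers injected.
rng = np.random.default_rng(1)
X = np.linspace(-3.0, 3.0, 200)[:, None]
y = np.sin(X).ravel() + 0.05 * rng.standard_normal(200)
y[rng.choice(200, 10, replace=False)] += 3.0   # outliers
X_trim, y_trim = stage1_lts_svr(X, y)
model = stage2_rbfn(X_trim, y_trim)
print("max abs error on the grid:", float(np.abs(model(X) - np.sin(X).ravel()).max()))

For the transformation-based variant, the Box-Cox step mentioned in the abstract would amount to transforming the (positive) targets, for example with scipy.stats.boxcox, before the first stage and inverting the transformation on the predictions; that step is omitted from the sketch for brevity.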