Summary: | Master's === National Taiwan University of Science and Technology === Department of Electrical Engineering === 98 === In this thesis, we propose the least trimmed squares-support vector regression radial basis function network (LTS-SVR RBFN) and the transformation-based LTS-SVR RBFN.
The aim of the LTS-SVR RBFN is to reduce the effect of outliers and large noise on modeling problems, since the traditional learning algorithm is not robust. The LTS-SVR RBFN adopts a two-stage strategy. In the first stage, the LTS-SVR procedure solves the initial-value problems of both the RBFN and the LTS. In addition, this procedure yields a new subsample that no longer contains the outliers and large noise. Hence, in the second stage, the gradient-descent learning algorithm can be applied directly to adjust the parameters of the RBFN.
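The following is a minimal Python sketch of this two-stage strategy, not the thesis's implementation: it assumes an RBF-kernel SVR for the first stage, a fixed trimming fraction, and the SVR support vectors as the initial RBFN centers, and for brevity it updates only the output weights by gradient descent.

```python
import numpy as np
from sklearn.svm import SVR

def lts_svr_trim(X, y, trim_fraction=0.2):
    """Stage 1: fit an SVR, then keep the samples with the smallest
    residuals (the least-trimmed-squares idea) as a clean subsample."""
    svr = SVR(kernel="rbf").fit(X, y)
    residuals = np.abs(y - svr.predict(X))
    n_keep = int(np.ceil((1.0 - trim_fraction) * len(y)))
    keep = np.argsort(residuals)[:n_keep]
    return X[keep], y[keep], svr

def fit_rbfn_gd(X, y, centers, lr=0.01, epochs=3000):
    """Stage 2: gradient descent on a Gaussian RBFN over the trimmed
    subsample (only the output weights are updated here; the centers
    and widths could be adjusted analogously)."""
    widths = np.full(len(centers), 1.0)
    weights = np.zeros(len(centers))
    for _ in range(epochs):
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        phi = np.exp(-d2 / (2.0 * widths ** 2))   # hidden-layer activations
        err = phi @ weights - y
        weights -= lr * phi.T @ err / len(y)      # mean-squared-error gradient
    return centers, widths, weights

# Toy data: a sine curve with a few gross outliers injected.
rng = np.random.default_rng(0)
X = np.linspace(-3, 3, 100)[:, None]
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(100)
y[rng.choice(100, 5, replace=False)] += 5.0
X_clean, y_clean, svr = lts_svr_trim(X, y)
centers, widths, weights = fit_rbfn_gd(X_clean, y_clean, svr.support_vectors_)
```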
The purpose of the transformation-based LTS-SVR RBFN is to reduce the influence of heteroscedastic/skewed noise in the training data set. Most learning algorithms can be regarded as statistical nonlinear regression models that assume a constant noise level; however, heteroscedastic/skewed noise is common in real-world data. The transformation-based LTS-SVR RBFN therefore combines the Box-Cox transformation with the LTS-SVR procedure to address this problem.
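As a rough illustration of how such a transformation might be wired in, the sketch below uses SciPy's Box-Cox routines to transform the targets before training and to map predictions back afterwards; the positivity shift and the placement of the transform are assumptions for illustration, not details from the thesis.

```python
import numpy as np
from scipy.stats import boxcox
from scipy.special import inv_boxcox

y = np.array([0.5, 1.2, 2.0, 4.5, 9.1, 20.3])  # example skewed targets

# Box-Cox requires strictly positive data, so shift if necessary.
shift = max(0.0, 1e-6 - float(y.min()))
y_bc, lam = boxcox(y + shift)                  # lam: estimated lambda

# ... train the LTS-SVR RBFN on the transformed targets y_bc ...

# Map model predictions back to the original scale.
y_pred_bc = y_bc                               # stand-in for model output
y_pred = inv_boxcox(y_pred_bc, lam) - shift
```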
Finally, the LTS-SVR RBFN and the transformation-based LTS-SVR RBFN are used to model various functions and systems contaminated by outliers and by heteroscedastic/skewed noise, respectively. The experimental results show that these approaches not only solve the initialization problems of the RBFN and the LTS but are also robust for data sets with outliers and heteroscedastic/skewed noise.
|