Improved Radial Basis Function Neural Network for the Outliers and Heteroscedasticity/Skewness Noises and Its Application
Master's === National Taiwan University of Science and Technology === Department of Electrical Engineering === 98 === In this thesis, we propose the least trimmed squares-support vector regression radial basis function network (LTS-SVR RBFN) and the transformation-based LTS-SVR RBFN. The LTS-SVR RBFN aims to reduce the effect of outliers and large noise on the modeling prob...
Main Authors: | Yue-Shiang Liu, 劉岳翔 |
---|---|
Other Authors: | Shun-Feng Su |
Format: | Others |
Language: | en_US |
Published: | 2010 |
Online Access: | http://ndltd.ncl.edu.tw/handle/05302393733029979823 |
id |
ndltd-TW-098NTUS5442029 |
---|---|
record_format |
oai_dc |
spelling |
ndltd-TW-098NTUS5442029 2016-04-22T04:23:32Z http://ndltd.ncl.edu.tw/handle/05302393733029979823 Improved Radial Basis Function Neural Network for the Outliers and Heteroscedasticity/Skewness Noises and Its Application 徑向基底類神經網路對大誤差與異變數/歪斜雜訊資料之改良與應用 Yue-Shiang Liu 劉岳翔 Master's === National Taiwan University of Science and Technology === Department of Electrical Engineering === 98 === (abstract: see the description field below) Shun-Feng Su 蘇順豐 2010 Thesis; 111 en_US |
collection |
NDLTD |
language |
en_US |
format |
Others |
sources |
NDLTD |
description |
Master's === National Taiwan University of Science and Technology === Department of Electrical Engineering === 98 === In this thesis, we propose the least trimmed squares-support vector regression radial basis function network (LTS-SVR RBFN) and the transformation-based LTS-SVR RBFN. The LTS-SVR RBFN aims to reduce the effect of outliers and large noise on the modeling problem, because the traditional learning algorithm is not robust. The LTS-SVR RBFN adopts a two-stage strategy. In the first stage, the LTS-SVR resolves the initial-value problems of both the RBFN and the LTS; in addition, a new subsample that no longer contains the outliers and large-noise points is preserved after the LTS-SVR procedure. Hence, after the first stage, the gradient-descent learning algorithm can be applied directly to adjust the parameters of the RBFN. The transformation-based LTS-SVR RBFN aims to reduce the influence of heteroscedastic/skewed noise in the training data set. Most learning algorithms are treated as statistical nonlinear regression models that assume a constant noise level; however, heteroscedastic/skewed noise is common in real-world data. The transformation-based LTS-SVR RBFN employs the Box-Cox transformation together with the LTS-SVR procedure to address this problem. Finally, the LTS-SVR RBFN and the transformation-based LTS-SVR RBFN are used to model various functions and systems contaminated with outliers and with heteroscedastic/skewed noise, respectively. The experimental results show that these approaches not only resolve the initialization problems of the RBFN and the LTS but are also robust for data sets containing outliers and heteroscedastic/skewed noise. (Illustrative sketches of the two-stage idea and of the Box-Cox step follow this record.)
|
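The abstract describes the two-stage LTS-SVR RBFN only in prose. The following is a minimal sketch of that idea, not the author's implementation: it assumes scikit-learn's SVR as the first-stage fitter, a least-trimmed-squares style cut on the SVR residuals, and a Gaussian RBFN whose weights are tuned by gradient descent in the second stage. The helper names (`lts_svr_trim`, `fit_rbfn_gd`), the `trim_fraction` parameter, and the k-means center initialization are illustrative assumptions; in the thesis, the initial values of the RBFN are obtained from the LTS-SVR stage itself.

```python
# Minimal sketch (not the author's code) of the two-stage idea in the abstract:
#   Stage 1: fit an SVR, rank samples by absolute residual, and keep only the
#            h best-fitting samples (least-trimmed-squares style), which drops
#            outliers and large-noise points.
#   Stage 2: tune a Gaussian RBFN on the trimmed subsample by gradient descent.
import numpy as np
from sklearn.svm import SVR
from sklearn.cluster import KMeans

def lts_svr_trim(X, y, trim_fraction=0.2, **svr_kwargs):
    """Stage 1: keep the (1 - trim_fraction) samples with the smallest SVR residuals."""
    svr = SVR(**svr_kwargs).fit(X, y)
    residuals = np.abs(y - svr.predict(X))
    h = int(np.ceil((1.0 - trim_fraction) * len(y)))   # number of samples to keep
    keep = np.argsort(residuals)[:h]                   # indices of best-fitting samples
    return X[keep], y[keep]

def fit_rbfn_gd(X, y, n_centers=10, lr=1e-2, epochs=500):
    """Stage 2: Gaussian RBFN; k-means centers (placeholder init), weights tuned by gradient descent."""
    centers = KMeans(n_clusters=n_centers, n_init=10).fit(X).cluster_centers_
    sigma = np.mean(np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2))
    w = np.zeros(n_centers)

    def design(Xq):
        d2 = np.sum((Xq[:, None, :] - centers[None, :, :]) ** 2, axis=2)
        return np.exp(-d2 / (2.0 * sigma ** 2))        # Gaussian basis outputs

    Phi = design(X)
    for _ in range(epochs):
        err = Phi @ w - y                              # prediction error on the trimmed data
        w -= lr * Phi.T @ err / len(y)                 # gradient step on the squared loss
    return lambda Xq: design(Xq) @ w

# Usage: trim first, then fit the RBFN on the cleaned subsample.
# X, y = ...                                           # training data containing outliers
# X_t, y_t = lts_svr_trim(X, y, trim_fraction=0.2, C=10.0, epsilon=0.1)
# model = fit_rbfn_gd(X_t, y_t, n_centers=15)
```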
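The abstract likewise only names the Box-Cox transformation as the remedy for heteroscedastic/skewed noise. The sketch below shows one way such a transformation could wrap the pipeline above, using SciPy's `boxcox`/`inv_boxcox`; the positivity shift and the generic `fit_fn` wrapper interface are assumptions for illustration, not details taken from the thesis.

```python
# Minimal sketch (assumption-laden, not the author's code): transform the targets
# with Box-Cox to stabilize variance / reduce skewness, fit any regressor in the
# transformed space, and invert the transformation at prediction time.
# Box-Cox: y(lam) = (y**lam - 1) / lam for lam != 0, and log(y) for lam == 0.
import numpy as np
from scipy.stats import boxcox
from scipy.special import inv_boxcox

def fit_boxcox_wrapped(X, y, fit_fn):
    """Wrap a regression fitter fit_fn(X, y) -> predict with a Box-Cox target transform."""
    shift = max(0.0, 1e-6 - float(np.min(y)))   # Box-Cox requires strictly positive targets
    y_bc, lam = boxcox(y + shift)               # lambda estimated by maximum likelihood
    model = fit_fn(X, y_bc)                     # fit in the (more homoscedastic) transformed space
    return lambda Xq: inv_boxcox(model(Xq), lam) - shift   # map predictions back to the original scale

# Usage with the helpers sketched above (trim, then fit the RBFN, all on Box-Cox targets):
# predict = fit_boxcox_wrapped(X, y, lambda X_, y_: fit_rbfn_gd(*lts_svr_trim(X_, y_)))
```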
author2 |
Shun-Feng Su |
author |
Yue-Shiang Liu 劉岳翔 |
title |
Improved Radial Basis Function Neural Network for the Outliers and Heteroscedasticity/Skewness Noises and Its Application |
publishDate |
2010 |
url |
http://ndltd.ncl.edu.tw/handle/05302393733029979823 |