LevelBoost: A Boosting Algorithm Using Different Loss Functions on Partitioned Data Sets
Master's thesis === National Chiao Tung University === Department of Electrical and Control Engineering === 91 === AdaBoost is a boosting algorithm that iteratively assigns weights to the data and trains a base classifier on the weighted data. It is sensitive to noise because of its exponential loss function: the weights assigned to noisy data points will grow...
Main Authors: Yu-An Shih, 施昱安
Other Authors: Chi-Cheng Jou
Format: Others
Language: en_US
Published: 2003
Online Access: http://ndltd.ncl.edu.tw/handle/10512127926319658296
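The abstract above describes AdaBoost's re-weighting mechanism and its sensitivity to noise under the exponential loss. A minimal sketch of that standard weight-update step is given below; it is an illustration of plain AdaBoost only (using scikit-learn decision stumps as base classifiers), not of the thesis's LevelBoost method, and the function names are hypothetical.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Illustrative AdaBoost sketch: labels y are in {-1, +1}. Each round
# re-weights the data with the exponential loss, so misclassified
# (possibly noisy) points receive exponentially larger weights.
def adaboost_fit(X, y, n_rounds=50):
    n = len(y)
    w = np.full(n, 1.0 / n)                  # uniform initial weights
    stumps, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)     # train base classifier on weighted data
        pred = stump.predict(X)
        err = np.sum(w * (pred != y)) / np.sum(w)
        err = np.clip(err, 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)  # weight of this base classifier
        # Exponential re-weighting: wrong points grow, correct points shrink.
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, alphas

def adaboost_predict(stumps, alphas, X):
    scores = sum(a * s.predict(X) for s, a in zip(stumps, alphas))
    return np.sign(scores)
```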
Similar Items
- FlexBoost: A Flexible Boosting Algorithm With Adaptive Loss Functions
  by: Yong-Seok Jeon, et al.
  Published: (2019-01-01)
- RolexBoost: A Rotation-Based Boosting Algorithm With Adaptive Loss Functions
  by: Dong-Hyuk Yang, et al.
  Published: (2020-01-01)
- Boosting Boosting
  by: Appel, Ron
  Published: (2017)
- Using genetic algorithms and boosting for data preprocessing
  by: Lei, Celestino
  Published: (2002)
- Boosting for Learning From Imbalanced, Multiclass Data Sets
  by: Abouelenien, Mohamed
  Published: (2013)