Analysis of Generalization Ability for Different AdaBoost Variants Based on Classification and Regression Trees
As a machine learning method, AdaBoost is widely applied to data classification and object detection because of its robustness and efficiency. AdaBoost constructs a global and optimal combination of weak classifiers based on sample reweighting. This kind of combination is known to improve classification performance tremendously. As the popularity of AdaBoost has grown, many variants have been proposed to improve its performance, and many comparison and review studies of these variants have also been published. Some researchers compared different AdaBoost variants experimentally in their own fields, while others reviewed various variants by briefly introducing the algorithms. However, a mathematical analysis of the generalization abilities of different AdaBoost variants is still lacking. In this paper, we analyze the generalization abilities of six AdaBoost variants in terms of classification margins: Real AdaBoost, Gentle AdaBoost, Modest AdaBoost, Parameterized AdaBoost, Margin-pruning Boost, and Penalized AdaBoost. Finally, we verify our analyses with experiments.
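The abstract's core claim is that AdaBoost combines weak classifiers through sample reweighting: after each round, misclassified samples gain weight so the next weak learner focuses on them. The sketch below shows discrete AdaBoost with decision stumps (the simplest classification trees) on a hypothetical 1-D toy dataset; it is an illustration of the generic algorithm, not of any specific variant analyzed in the paper.

```python
import math

# Toy 1-D dataset: label +1 when x > 5, with the point x = 10 mislabeled.
X = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
y = [-1, -1, -1, -1, -1, 1, 1, 1, 1, -1]

def stump(threshold, polarity):
    """Decision stump: predicts `polarity` if x > threshold, else -polarity."""
    return lambda x: polarity if x > threshold else -polarity

def train_adaboost(X, y, rounds=3):
    n = len(X)
    w = [1.0 / n] * n                           # uniform initial sample weights
    candidates = [stump(t, p) for t in range(11) for p in (1, -1)]
    ensemble = []                               # list of (alpha, weak classifier)
    for _ in range(rounds):
        # Pick the stump with the lowest weighted training error.
        h, err = min(
            ((h, sum(wi for wi, xi, yi in zip(w, X, y) if h(xi) != yi))
             for h in candidates),
            key=lambda pair: pair[1])
        err = max(err, 1e-10)                   # guard against a perfect stump
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, h))
        # Reweight: misclassified samples gain weight, correct ones lose it.
        w = [wi * math.exp(-alpha * yi * h(xi)) for wi, xi, yi in zip(w, X, y)]
        total = sum(w)
        w = [wi / total for wi in w]
    return ensemble

def predict(ensemble, x):
    return 1 if sum(alpha * h(x) for alpha, h in ensemble) > 0 else -1

ens = train_adaboost(X, y, rounds=3)
print([predict(ens, x) for x in X])  # matches y: three stumps fit every point
```

No single stump can separate this data, but the weighted vote of three can, which is exactly the "combination improves performance" effect the abstract refers to.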
Main Authors: | Shuqiong Wu, Hiroshi Nagahashi |
---|---|
Format: | Article |
Language: | English |
Published: | Hindawi Limited, 2015-01-01 |
Series: | Journal of Electrical and Computer Engineering |
Online Access: | http://dx.doi.org/10.1155/2015/835357 |
id |
doaj-4371a54b69b2400493d5208e982d21bf |
---|---|
record_format |
Article |
spelling |
doaj-4371a54b69b2400493d5208e982d21bf
2021-07-02T01:41:21Z
eng
Hindawi Limited
Journal of Electrical and Computer Engineering
2090-0147, 2090-0155
2015-01-01
2015
10.1155/2015/835357
835357
Analysis of Generalization Ability for Different AdaBoost Variants Based on Classification and Regression Trees
Shuqiong Wu (Department of Computational Intelligence and Systems Science, Tokyo Institute of Technology, 4259 Nagatsuta-cho, Midori-ku, Yokohama, Kanagawa 226-8503, Japan)
Hiroshi Nagahashi (Imaging Science and Engineering Laboratory, Tokyo Institute of Technology, 4259 Nagatsuta-cho, Midori-ku, Yokohama, Kanagawa 226-8503, Japan)
http://dx.doi.org/10.1155/2015/835357 |
collection |
DOAJ |
language |
English |
format |
Article |
sources |
DOAJ |
author |
Shuqiong Wu
Hiroshi Nagahashi |
spellingShingle |
Shuqiong Wu
Hiroshi Nagahashi
Analysis of Generalization Ability for Different AdaBoost Variants Based on Classification and Regression Trees
Journal of Electrical and Computer Engineering |
author_facet |
Shuqiong Wu
Hiroshi Nagahashi |
author_sort |
Shuqiong Wu |
title |
Analysis of Generalization Ability for Different AdaBoost Variants Based on Classification and Regression Trees |
title_short |
Analysis of Generalization Ability for Different AdaBoost Variants Based on Classification and Regression Trees |
title_full |
Analysis of Generalization Ability for Different AdaBoost Variants Based on Classification and Regression Trees |
title_fullStr |
Analysis of Generalization Ability for Different AdaBoost Variants Based on Classification and Regression Trees |
title_full_unstemmed |
Analysis of Generalization Ability for Different AdaBoost Variants Based on Classification and Regression Trees |
title_sort |
analysis of generalization ability for different adaboost variants based on classification and regression trees |
publisher |
Hindawi Limited |
series |
Journal of Electrical and Computer Engineering |
issn |
2090-0147 2090-0155 |
publishDate |
2015-01-01 |
description |
As a machine learning method, AdaBoost is widely applied to data classification and object detection because of its robustness and efficiency. AdaBoost constructs a global and optimal combination of weak classifiers based on sample reweighting. This kind of combination is known to improve classification performance tremendously. As the popularity of AdaBoost has grown, many variants have been proposed to improve its performance, and many comparison and review studies of these variants have also been published. Some researchers compared different AdaBoost variants experimentally in their own fields, while others reviewed various variants by briefly introducing the algorithms. However, a mathematical analysis of the generalization abilities of different AdaBoost variants is still lacking. In this paper, we analyze the generalization abilities of six AdaBoost variants in terms of classification margins: Real AdaBoost, Gentle AdaBoost, Modest AdaBoost, Parameterized AdaBoost, Margin-pruning Boost, and Penalized AdaBoost. Finally, we verify our analyses with experiments. |
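The analysis described here is framed in terms of classification margins. For a boosted ensemble F(x) = Σ_t α_t h_t(x) and a label y in {-1, +1}, the normalized margin of a sample is y·F(x) / Σ_t |α_t|; larger margins are the standard proxy for better generalization in boosting theory. A brief sketch of computing this quantity (the two-stump ensemble and data points are hypothetical):

```python
def margins(ensemble, X, y):
    """Normalized margins y * F(x) / sum|alpha| for an ensemble of
    (alpha, h) pairs, where each h maps x to -1 or +1."""
    norm = sum(abs(alpha) for alpha, _ in ensemble)
    return [yi * sum(alpha * h(xi) for alpha, h in ensemble) / norm
            for xi, yi in zip(X, y)]

# Hypothetical two-stump ensemble on a 1-D problem.
ens = [(1.0, lambda x: 1 if x > 5 else -1),
       (0.5, lambda x: 1 if x > 2 else -1)]
X = [1, 4, 8]
y = [-1, -1, 1]
print(margins(ens, X, y))  # the middle point sits closer to the boundary
```

A margin of 1 means every weak classifier votes for the correct label; values near 0 flag samples the ensemble barely gets right, which is where the variants compared in the paper differ.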
url |
http://dx.doi.org/10.1155/2015/835357 |
work_keys_str_mv |
AT shuqiongwu analysisofgeneralizationabilityfordifferentadaboostvariantsbasedonclassificationandregressiontrees AT hiroshinagahashi analysisofgeneralizationabilityfordifferentadaboostvariantsbasedonclassificationandregressiontrees |
_version_ |
1721344604517695488 |