Abstract

AdaBoost (Freund and Schapire, Eur. Conf. Comput. Learn. Theory 23–37, 1995) chooses a good set of weak classifiers in rounds. On each round, it selects the optimal classifier (the optimal feature and its threshold value) by minimizing the weighted classification error. It also reweights the training data so that the next round focuses on data that are difficult to classify. When determining the optimal feature and its threshold value, a classification process is employed. This classification process usually makes a hard decision (Viola and Jones, Rapid object detection using a boosted cascade of simple features, 2001; Joo et al., Sci. World J. 2014:1–17, 2014; Friedman et al., Ann. Stat. 28:337–407, 2000). In this paper, we extend the classification process to a soft fuzzy decision. We believe this extension gives the AdaBoost algorithm more flexibility as well as better performance, especially when the training data set is not large. The AdaBoost algorithm generally assigns the same weight to every training datum on the first round of the boosting process (Freund and Schapire, Eur. Conf. Comput. Learn. Theory 23–37, 1995). We propose assigning different initial weights based on statistical properties of the involved features. Experimental results show that the proposed method yields higher performance than the compared methods.
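For reference, the round-by-round selection and reweighting summarized above can be sketched as follows. This is a minimal Python sketch of one standard AdaBoost round with hard-decision stumps, not the authors' fuzzy soft-decision extension or their feature-based initial weighting; the helper name `best_stump` and the toy data are illustrative assumptions.

```python
# Minimal sketch of one AdaBoost round with hard-decision stumps.
# The fuzzy soft decision and statistics-based initial weights proposed
# in the paper are not shown here.
import numpy as np

def best_stump(X, y, w):
    """Pick the feature, threshold, and polarity minimizing the weighted error."""
    n, d = X.shape
    best = (None, None, 1, np.inf)              # (feature, threshold, polarity, error)
    for j in range(d):
        for thr in np.unique(X[:, j]):
            for polarity in (1, -1):
                pred = np.where(polarity * (X[:, j] - thr) >= 0, 1, -1)
                err = np.sum(w[pred != y])      # weighted classification error
                if err < best[3]:
                    best = (j, thr, polarity, err)
    return best

def adaboost_round(X, y, w):
    """One boosting round: choose the stump, compute its vote, reweight the data."""
    j, thr, polarity, err = best_stump(X, y, w)
    err = np.clip(err, 1e-10, 1 - 1e-10)
    alpha = 0.5 * np.log((1 - err) / err)       # weight of this weak classifier
    pred = np.where(polarity * (X[:, j] - thr) >= 0, 1, -1)
    w = w * np.exp(-alpha * y * pred)           # emphasize misclassified data
    w /= w.sum()
    return (j, thr, polarity, alpha), w

# Usage: uniform initial weights, as in the standard first round.
X = np.array([[1.0, 2.0], [2.0, 1.0], [3.0, 3.0], [4.0, 1.5]])
y = np.array([1, 1, -1, -1])
w = np.full(len(y), 1.0 / len(y))
stump, w = adaboost_round(X, y, w)
print(stump, w)
```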