Incorporating posterior estimates into AdaBoost (article)
The article is published in a journal from the RSCI Web of Science list
Citation information for the article was obtained from Scopus
The article is published in a journal from the VAK list
The article is published in a journal indexed in Web of Science and/or Scopus
Date of the last search for the article in external sources: May 28, 2015
Abstract: Although boosting methods [9, 23] for creating compositions of weak hypotheses are among the best machine learning methods developed so far [4], they are known to degrade in performance on noisy data and overlapping classes. In this paper we consider binary classification and propose a reduction of the classification problem with overlapping classes to a deterministic one. We also derive a new upper generalization bound for weighted averages of weak hypotheses, which uses posterior estimates for the training objects and is based on the proposed reduction. Given accurate posterior estimates, this bound is tighter than the existing bound of Schapire et al. [22]. We design an AdaBoost-like algorithm that optimizes the proposed generalization bound and show that, when supplied with good posterior estimates, it outperforms standard AdaBoost on real-world data sets.
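For context, the baseline the paper modifies is standard AdaBoost [9, 23], which reweights training objects after each round and combines weak hypotheses by weighted voting. Below is a minimal sketch of that baseline for binary labels in {-1, +1} using one-dimensional threshold stumps as weak learners; it is not the authors' posterior-based variant, and the function names and the choice of stump learner are illustrative assumptions.

```python
import numpy as np

def train_adaboost(X, y, n_rounds=10):
    """Standard AdaBoost (illustrative sketch, not the paper's variant).
    X: (n, d) feature matrix; y: labels in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)   # object weights, uniform at the start
    ensemble = []             # list of (alpha, feature, threshold, polarity)
    for _ in range(n_rounds):
        best, best_err = None, np.inf
        # exhaustive search over stumps h(x) = polarity * sign(x[f] - thr)
        for f in range(X.shape[1]):
            for thr in np.unique(X[:, f]):
                for polarity in (+1, -1):
                    pred = polarity * np.sign(X[:, f] - thr)
                    pred[pred == 0] = polarity
                    err = w[pred != y].sum()   # weighted training error
                    if err < best_err:
                        best_err, best = err, (f, thr, polarity, pred)
        f, thr, polarity, pred = best
        best_err = min(max(best_err, 1e-10), 1 - 1e-10)  # numerical safety
        alpha = 0.5 * np.log((1 - best_err) / best_err)  # hypothesis weight
        w *= np.exp(-alpha * y * pred)                   # upweight mistakes
        w /= w.sum()
        ensemble.append((alpha, f, thr, polarity))
    return ensemble

def predict(ensemble, X):
    """Weighted-majority vote of the weak hypotheses."""
    score = np.zeros(len(X))
    for alpha, f, thr, polarity in ensemble:
        pred = polarity * np.sign(X[:, f] - thr)
        pred[pred == 0] = polarity
        score += alpha * pred
    return np.sign(score)
```

The paper's algorithm replaces the objective behind these weight updates with one that optimizes the proposed posterior-based generalization bound, which is where the improvement on noisy, overlapping-class data comes from.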