A generalized boosting algorithm and its application to two-class chemical classification problem [An article from: Analytica Chimica Acta]
Book Details
Publisher: Elsevier
ISBN / ASIN: B000RR6WRQ
ISBN-13: 978B000RR6WR4
Availability: Available for download now
Marketplace: United States 🇺🇸
Description
This digital document is a journal article from Analytica Chimica Acta, published by Elsevier. The article is delivered in HTML format and is available in your Amazon.com Media Library immediately after purchase. You can view it with any web browser.
Description:
Boosting is one of the most important recent developments in classification methodology. It can significantly improve the prediction performance of any single classification algorithm and has been successfully applied to many different fields, including problems in chemometrics. Boosting works by sequentially applying a classification algorithm to reweighted versions of the training data and then taking a weighted majority vote of the sequence of classifiers thus produced. In this paper, we propose a generalized boosting algorithm based on the Bayes optimal decision rule. Using this rule, we adjust the weights of the sequence of classifiers in the voting process of the boosting algorithm. The two types of classification errors are introduced into the generalized boosting, making the voting process more sensible, and the weights of the training samples are correspondingly adjusted according to a suitable criterion. The generalized boosting is applied to the binary classification of chemical data. Experimental results show that it improves prediction accuracy compared with the AdaBoost algorithm, especially when the difference between the two types of classification errors is large.
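To make the boosting loop described in the abstract concrete, here is a minimal sketch of AdaBoost-style boosting on one-dimensional data with decision stumps, extended with hypothetical class-dependent cost parameters `c_pos` and `c_neg` to mimic treating the two error types differently. This is an illustrative assumption, not the paper's Bayes-optimal weighting scheme.

```python
import math

def stump_train(X, y, w):
    """Find the best threshold stump on 1-D data under sample weights w."""
    best = None
    for thr in sorted(set(X)):
        for sign in (1, -1):  # polarity: which side is predicted +1
            pred = [sign if x >= thr else -sign for x in X]
            err = sum(wi for wi, p, yi in zip(w, pred, y) if p != yi)
            if best is None or err < best[0]:
                best = (err, thr, sign)
    return best  # (weighted error, threshold, polarity)

def boost(X, y, rounds=10, c_pos=1.0, c_neg=1.0):
    """Sequentially fit stumps to reweighted data; c_pos/c_neg (hypothetical)
    scale the reweighting of errors on the positive/negative class."""
    n = len(X)
    w = [1.0 / n] * n
    ensemble = []
    for _ in range(rounds):
        err, thr, sign = stump_train(X, y, w)
        err = max(err, 1e-10)
        if err >= 0.5:  # weak learner no better than chance: stop
            break
        alpha = 0.5 * math.log((1 - err) / err)  # classifier vote weight
        ensemble.append((alpha, thr, sign))
        # Reweight samples: mistakes on positives scaled by c_pos,
        # on negatives by c_neg, then renormalize.
        for i in range(n):
            pred = sign if X[i] >= thr else -sign
            cost = c_pos if y[i] == 1 else c_neg
            w[i] *= math.exp(-alpha * y[i] * pred * cost)
        s = sum(w)
        w = [wi / s for wi in w]
    return ensemble

def predict(ensemble, x):
    """Weighted majority vote over the sequence of stumps."""
    score = sum(a * (s if x >= t else -s) for a, t, s in ensemble)
    return 1 if score >= 0 else -1
```

For example, `boost([1, 2, 3, 4, 5, 6], [-1, -1, -1, 1, 1, 1])` learns an ensemble whose weighted vote separates the two classes at the threshold between 3 and 4; raising `c_pos` above `c_neg` makes subsequent rounds concentrate more weight on misclassified positive samples.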
