
AdaBoost with SVM-based component classifiers

Journal Article


Download full-text (Open Access)

Abstract


  • The use of SVM (Support Vector Machine) as a component classifier in AdaBoost may seem to go against the grain of the Boosting principle, since SVM is not an easy classifier to train. Moreover, Wickramaratna et al. [2001. Performance degradation in boosting. In: Proceedings of the Second International Workshop on Multiple Classifier Systems, pp. 11–21] show that AdaBoost with strong component classifiers is not viable. In this paper, we show that AdaBoost incorporating properly designed RBFSVM (SVM with the RBF kernel) component classifiers, which we call AdaBoostSVM, can perform as well as SVM. Furthermore, the proposed AdaBoostSVM demonstrates better generalization performance than SVM on imbalanced classification problems. The key idea of AdaBoostSVM is that, for the sequence of trained RBFSVM component classifiers, the kernel width σ starts large (implying weak learning) and is reduced progressively as the Boosting iteration proceeds. This effectively produces a set of RBFSVM component classifiers whose model parameters are adaptively different, yielding better generalization than an AdaBoost approach whose SVM component classifiers use a fixed (optimal) σ value. On benchmark data sets, our AdaBoostSVM approach outperforms other AdaBoost approaches using component classifiers such as Decision Trees and Neural Networks. AdaBoostSVM can be seen as a proof of concept of the idea proposed in Valentini and Dietterich [2004. Bias-variance analysis of support vector machines for the development of SVM-based ensemble methods. Journal of Machine Learning Research 5, 725–775] that AdaBoost with heterogeneous SVMs could work well. Moreover, we extend AdaBoostSVM to the Diverse AdaBoostSVM to address the reported accuracy/diversity dilemma of the original AdaBoost. By designing parameter-adjusting strategies, the distributions of accuracy and diversity over the RBFSVM component classifiers are tuned to maintain a good balance between them, and promising results have been obtained on benchmark data sets.
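The core mechanism the abstract describes (AdaBoost whose weak learners are RBF-kernel SVMs, with the kernel width σ shrinking across Boosting rounds) can be sketched as follows. This is a minimal illustration assuming scikit-learn is available; the parameter names (`sigma_init`, `sigma_min`, `sigma_step`) and the simple "shrink σ when the weighted error exceeds 0.5" rule are illustrative simplifications, not the paper's exact algorithm.

```python
import numpy as np
from sklearn.svm import SVC

def adaboost_svm(X, y, sigma_init=10.0, sigma_min=0.5, sigma_step=0.5, max_iters=20):
    """Toy AdaBoostSVM sketch. Labels y must be in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)            # AdaBoost sample distribution D_t
    sigma = sigma_init                 # large width => deliberately weak RBFSVM
    ensemble = []                      # list of (alpha_t, trained SVC)
    for _ in range(max_iters):
        # RBF kernel exp(-||x - x'||^2 / (2 sigma^2)) => gamma = 1 / (2 sigma^2)
        clf = SVC(kernel="rbf", gamma=1.0 / (2.0 * sigma ** 2))
        clf.fit(X, y, sample_weight=w)
        pred = clf.predict(X)
        err = w[pred != y].sum()       # weighted training error
        if err > 0.5:                  # learner too weak: shrink sigma, retry
            sigma -= sigma_step
            if sigma < sigma_min:
                break
            continue
        err = min(max(err, 1e-10), 1.0 - 1e-10)
        alpha = 0.5 * np.log((1.0 - err) / err)
        ensemble.append((alpha, clf))
        w = w * np.exp(-alpha * y * pred)  # up-weight misclassified samples
        w /= w.sum()
    return ensemble

def adaboost_svm_predict(ensemble, X):
    """Weighted-vote prediction: sign of the alpha-weighted sum of outputs."""
    score = sum(alpha * clf.predict(X) for alpha, clf in ensemble)
    return np.sign(score)
```

With a large initial σ each RBFSVM is only moderately accurate, as Boosting requires; reducing σ over the iterations makes later component classifiers stronger, which is the σ-scheduling idea the abstract attributes to AdaBoostSVM.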

Authors


  •   Li, Xuchun (external author)
  •   Wang, Lei
  •   Sung, Eric (external author)

Publication Date


  • 2008

Citation


  • Li, X., Wang, L. & Sung, E. (2008). AdaBoost with SVM-based component classifiers. Engineering Applications of Artificial Intelligence, 21 (5), 785-795.

Scopus EID


  • 2-s2.0-44649197212

RO Full-text URL


  • http://ro.uow.edu.au/cgi/viewcontent.cgi?article=1608&context=eispapers

RO Metadata URL


  • http://ro.uow.edu.au/eispapers/602

Number Of Pages


  • 10

Start Page


  • 785

End Page


  • 795

Volume


  • 21

Issue


  • 5

Place Of Publication


  • United Kingdom
