
Feature selection with kernel class separability

Journal Article


Abstract


  • Classification can often benefit from efficient feature selection. However, the presence of linearly nonseparable data, the need for quick response, small sample sizes, and noisy features make feature selection quite challenging. In this work, a class separability criterion is developed in a high-dimensional kernel space, and feature selection is performed by maximizing this criterion. To make this feature selection approach work, the issues of automatic kernel parameter tuning, numerical stability, and regularization for multiparameter optimization are addressed. Theoretical analysis uncovers the relationship of this criterion to the radius-margin bound of Support Vector Machines (SVMs), Kernel Fisher Discriminant Analysis (KFDA), and the kernel alignment criterion, providing further insight into feature selection with this criterion. The criterion is applied to a variety of selection modes using different search strategies. An extensive experimental study demonstrates its efficiency in delivering fast and robust feature selection.
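The abstract describes maximizing a class separability criterion evaluated in a kernel-induced feature space. A common formulation of such a criterion is the ratio of between-class to within-class scatter, both computable entirely from the kernel (Gram) matrix. The sketch below is illustrative only, not the paper's exact criterion: the RBF kernel, the scatter-trace ratio, and the greedy forward search are assumptions chosen for concreteness.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gram matrix of the RBF kernel k(x, y) = exp(-gamma * ||x - y||^2)
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.clip(d2, 0.0, None))

def kernel_class_separability(X, y, gamma=1.0):
    # Ratio of between-class to within-class scatter, with both scatter
    # traces evaluated in the kernel feature space via the Gram matrix alone.
    K = rbf_kernel(X, gamma)
    n = len(y)
    grand = K.sum() / n ** 2  # squared-norm term of the grand mean embedding
    between = within = 0.0
    for c in np.unique(y):
        idx = np.flatnonzero(y == c)
        n_c = len(idx)
        Kcc = K[np.ix_(idx, idx)]
        # n_c * ||m_c - m||^2, expanded through kernel evaluations
        between += n_c * (Kcc.sum() / n_c ** 2
                          - 2.0 * K[idx].sum() / (n_c * n)
                          + grand)
        # sum over class members of ||phi(x) - m_c||^2
        within += Kcc.trace() - Kcc.sum() / n_c
    return between / (within + 1e-12)

def forward_select(X, y, n_features, gamma=1.0):
    # Greedy forward search: repeatedly add the feature whose inclusion
    # yields the highest kernel class separability.
    selected, remaining = [], list(range(X.shape[1]))
    while remaining and len(selected) < n_features:
        _, j = max((kernel_class_separability(X[:, selected + [j]], y, gamma), j)
                   for j in remaining)
        selected.append(j)
        remaining.remove(j)
    return selected
```

On synthetic data with one class-informative feature and one noise feature, the criterion scores the informative feature higher and the forward search picks it first. The paper's criterion additionally tunes the kernel parameter automatically and regularizes the multiparameter optimization, which this sketch omits.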

Publication Date


  • 2008

Citation


  • Wang, L. (2008). Feature selection with kernel class separability. IEEE Transactions on Pattern Analysis and Machine Intelligence, 30 (9), 1534-1546.

Scopus Eid


  • 2-s2.0-48049087439

Ro Metadata Url


  • http://ro.uow.edu.au/eispapers/461

Number Of Pages


  • 12

Start Page


  • 1534

End Page


  • 1546

Volume


  • 30

Issue


  • 9

Place Of Publication


  • United States
