A method of discriminative information preservation and in-dimension distance minimization method for feature selection

Conference Paper


Abstract


  • © 2014 IEEE. Preserving samples' pairwise similarity is essential for feature selection. In supervised learning, labels can be used as a direct measure of whether two samples are similar to each other. In unsupervised learning, however, such similarity information is usually unavailable. In this paper, we propose a new feature selection method based on spectral clustering, using discriminative information as the underlying data structure. The Laplacian matrix is used to obtain more partitioning information than previously proposed structures such as the eigenspace of the original data. The high-dimensional sample data are projected into a low-dimensional space, and the in-dimension distance is also considered to obtain a more compact clustering result. The proposed method can be solved efficiently by updating the projection matrix and its inverse normalized diagonal matrix. A comprehensive experimental study demonstrates that the proposed method outperforms many state-of-the-art feature selection algorithms under different criteria, including clustering/classification accuracy and the Jaccard score.
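The abstract describes the approach only at a high level. The sketch below illustrates the general spectral, Laplacian-based idea it refers to: build a sample-similarity graph, form its normalized Laplacian, take a low-dimensional spectral embedding from its eigenvectors, and score features against that embedding. This is not the authors' algorithm; the graph construction, scoring rule, and all parameter choices are assumptions made purely for illustration.

```python
# Minimal illustrative sketch of Laplacian-based spectral feature scoring.
# NOT the method from the paper; every design choice here is an assumption.
import numpy as np

def spectral_feature_scores(X, n_neighbors=5, n_components=2):
    """X: (n_samples, n_features) data matrix. Returns one score per feature."""
    n = X.shape[0]

    # Pairwise squared Euclidean distances between samples.
    sq = np.sum(X**2, axis=1)
    d2 = np.maximum(sq[:, None] + sq[None, :] - 2.0 * (X @ X.T), 0.0)

    # k-nearest-neighbour adjacency with a Gaussian (heat-kernel) weight.
    sigma2 = np.mean(d2) + 1e-12
    W = np.exp(-d2 / sigma2)
    np.fill_diagonal(W, 0.0)
    weak = np.argsort(-W, axis=1)[:, n_neighbors:]   # drop all but top-k links
    for i in range(n):
        W[i, weak[i]] = 0.0
    W = np.maximum(W, W.T)                            # symmetrize the graph

    # Normalized graph Laplacian L = I - D^{-1/2} W D^{-1/2}.
    d = W.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(d, 1e-12))
    L = np.eye(n) - (d_inv_sqrt[:, None] * W) * d_inv_sqrt[None, :]

    # Smallest non-trivial eigenvectors give a low-dimensional spectral
    # embedding that carries the partitioning (cluster) structure.
    _, vecs = np.linalg.eigh(L)
    Y = vecs[:, 1:1 + n_components]

    # Score each feature by its correlation with the embedding: features that
    # vary consistently with the discovered structure score higher.
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    corr = np.abs(Xc.T @ Yc) / (
        np.linalg.norm(Xc, axis=0)[:, None]
        * np.linalg.norm(Yc, axis=0)[None, :] + 1e-12
    )
    return corr.max(axis=1)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(60, 10))
    X[:30, 0] += 3.0  # make feature 0 discriminative between two groups
    print(np.argsort(-spectral_feature_scores(X))[:3])
```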

Authors


  •   Huang, Shangrong (external author)
  •   Zhang, Jian (external author)
  •   Liu, Xinwang (external author)
  •   Wang, Lei

Publication Date


  • 2014

Citation


  • Huang, S., Zhang, J., Liu, X. & Wang, L. (2014). A method of discriminative information preservation and in-dimension distance minimization method for feature selection. 22nd International Conference on Pattern Recognition (ICPR 2014) (pp. 1615-1620). United States: IEEE Computer Society.

Scopus EID


  • 2-s2.0-84919912273

RO Metadata URL


  • http://ro.uow.edu.au/eispapers/3286

Start Page


  • 1615

End Page


  • 1620

Place of Publication


  • http://ieeexplore.ieee.org/xpl/abstractAuthors.jsp?arnumber=6976996&tag=1
