TY - GEN
T1 - L1 graph based on sparse coding for feature selection
AU - Xu, Jin
AU - Yang, Guang
AU - Man, Hong
AU - He, Haibo
PY - 2013
Y1 - 2013
N2 - In machine learning and pattern recognition, feature selection has been a very active topic in the literature. Unsupervised feature selection is challenging due to the lack of labels, which would supply the categorical information. How to define an appropriate metric is the key to feature selection. In this paper, we propose a "filter" method for unsupervised feature selection, which is based on the geometric properties of the ℓ1 graph. The ℓ1 graph is constructed through sparse coding. The graph establishes the relations of feature subspaces, and the quality of features is evaluated by the features' locality-preserving ability. We compare our method with classic unsupervised feature selection methods (Laplacian score and Pearson correlation) and a supervised method (Fisher score) on benchmark data sets. The classification results based on support vector machines, k-nearest neighbors, and multi-layer feed-forward networks demonstrate the efficiency and effectiveness of our method.
AB - In machine learning and pattern recognition, feature selection has been a very active topic in the literature. Unsupervised feature selection is challenging due to the lack of labels, which would supply the categorical information. How to define an appropriate metric is the key to feature selection. In this paper, we propose a "filter" method for unsupervised feature selection, which is based on the geometric properties of the ℓ1 graph. The ℓ1 graph is constructed through sparse coding. The graph establishes the relations of feature subspaces, and the quality of features is evaluated by the features' locality-preserving ability. We compare our method with classic unsupervised feature selection methods (Laplacian score and Pearson correlation) and a supervised method (Fisher score) on benchmark data sets. The classification results based on support vector machines, k-nearest neighbors, and multi-layer feed-forward networks demonstrate the efficiency and effectiveness of our method.
KW - Feature selection
KW - Neural network
KW - Sparse coding
KW - Unsupervised learning
UR - http://www.scopus.com/inward/record.url?scp=84880715885&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84880715885&partnerID=8YFLogxK
U2 - 10.1007/978-3-642-39065-4_71
DO - 10.1007/978-3-642-39065-4_71
M3 - Conference contribution
AN - SCOPUS:84880715885
SN - 9783642390647
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 594
EP - 601
BT - Advances in Neural Networks, ISNN 2013 - 10th International Symposium on Neural Networks, Proceedings
T2 - 10th International Symposium on Neural Networks, ISNN 2013
Y2 - 4 July 2013 through 6 July 2013
ER -