DCPE co-training for classification

Jin Xu, Haibo He, Hong Man

Research output: Contribution to journal › Article › peer-review

61 Scopus citations

Abstract

Co-training is a well-known semi-supervised learning technique that trains two base learners on the data source and uses the most confidently predicted unlabeled data to augment the labeled data during learning. In this paper, we use the diversity of class probability estimation (DCPE) between the two learners and propose the DCPE co-training approach. The key idea is to use DCPE to predict labels for the unlabeled data during training. Experimental studies with UCI data demonstrate that DCPE co-training is robust and efficient for classification. Comparative studies with supervised and semi-supervised learning methods also demonstrate the effectiveness of the proposed approach.
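
The abstract does not give the exact selection rule, so the following is a minimal, hypothetical sketch of DCPE-style co-training, not the published algorithm: two base learners are trained on the labeled pool, and at each round the unlabeled examples whose class-probability estimates differ most between the two learners are pseudo-labeled by the more confident learner and moved into the labeled set. The function name, the base learners, and the diversity and confidence heuristics are all assumptions for illustration.

```python
# Hypothetical sketch of DCPE-style co-training (assumed selection rule,
# not the algorithm published in the paper).
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier


def dcpe_co_training(X_lab, y_lab, X_unlab, rounds=10, per_round=10,
                     learner_a=None, learner_b=None):
    """Co-train two learners, augmenting labels via probability-estimate diversity."""
    learner_a = learner_a or DecisionTreeClassifier(random_state=0)
    learner_b = learner_b or KNeighborsClassifier(n_neighbors=5)

    X_lab, y_lab = np.asarray(X_lab), np.asarray(y_lab)
    X_pool = np.asarray(X_unlab)

    for _ in range(rounds):
        if len(X_pool) == 0:
            break
        learner_a.fit(X_lab, y_lab)
        learner_b.fit(X_lab, y_lab)

        # Class-probability estimates from both learners on the unlabeled pool.
        proba_a = learner_a.predict_proba(X_pool)
        proba_b = learner_b.predict_proba(X_pool)

        # Assumed DCPE-style diversity score: L1 distance between the two
        # learners' probability estimates for each unlabeled example.
        diversity = np.abs(proba_a - proba_b).sum(axis=1)

        # Pseudo-label the most diverse examples with the more confident learner.
        picked = np.argsort(diversity)[-per_round:]
        conf_a = proba_a[picked].max(axis=1)
        conf_b = proba_b[picked].max(axis=1)
        labels = np.where(conf_a >= conf_b,
                          learner_a.predict(X_pool[picked]),
                          learner_b.predict(X_pool[picked]))

        # Move the newly labeled examples from the unlabeled pool to the labeled set.
        X_lab = np.vstack([X_lab, X_pool[picked]])
        y_lab = np.concatenate([y_lab, labels])
        X_pool = np.delete(X_pool, picked, axis=0)

    return learner_a.fit(X_lab, y_lab), learner_b.fit(X_lab, y_lab)
```

With UCI-style data one could call dcpe_co_training(X_labeled, y_labeled, X_unlabeled) and evaluate the two returned learners on a held-out test set; the actual selection criterion and stopping conditions used in the paper may differ.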

Original language: English
Pages (from-to): 75-85
Number of pages: 11
Journal: Neurocomputing
Volume: 86
DOIs
State: Published - 1 Jun 2012

Keywords

  • Class probability estimation
  • Classification
  • Co-training
  • Diversity
  • Machine learning
  • Semi-supervised learning
