Active Batch Selection via Convex Relaxations with Guaranteed Solution Bounds

Abstract

Active learning techniques have gained popularity as a means of reducing the human effort needed to label data instances when inducing a classifier. When faced with large amounts of unlabeled data, such algorithms automatically identify exemplar instances for manual annotation. More recently, there have been attempts toward a batch mode form of active learning, where a batch of data points is selected simultaneously from an unlabeled set. In this paper, we propose two novel batch mode active learning (BMAL) algorithms: BatchRank and BatchRand. We first formulate the batch selection task as an NP-hard optimization problem; we then propose two convex relaxations, one based on linear programming and the other on semi-definite programming, to solve the batch selection problem. Finally, a deterministic bound is derived on the solution quality for the first relaxation and a probabilistic bound for the second. To the best of our knowledge, this is the first research effort to derive mathematical guarantees on the solution quality of the BMAL problem. Our extensive empirical studies on 15 challenging binary, multi-class and multi-label datasets corroborate that the proposed algorithms perform on par with state-of-the-art techniques, deliver high-quality solutions and are robust to real-world issues such as label noise and class imbalance.
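To illustrate the batch selection setting the abstract describes, the sketch below picks a batch of unlabeled instances by ranking predictive entropy. This is a generic, greedy baseline for batch mode active learning, not the paper's BatchRank or BatchRand algorithms, which instead solve a joint NP-hard objective via LP and SDP relaxations; the function name and scoring choice are illustrative assumptions.

```python
import numpy as np

def select_batch(probs, k):
    """Select a batch of k unlabeled instances by predictive entropy.

    probs: (n_unlabeled, n_classes) array of class probabilities
    from the current classifier. Returns indices of the k most
    uncertain instances.
    """
    # Predictive entropy as the uncertainty score (eps avoids log(0)).
    eps = 1e-12
    entropy = -np.sum(probs * np.log(probs + eps), axis=1)
    # Greedy top-k selection; a joint batch objective (as in the
    # paper) would also account for redundancy within the batch.
    return np.argsort(entropy)[::-1][:k]

probs = np.array([
    [0.50, 0.50],   # maximally uncertain
    [0.90, 0.10],   # fairly confident
    [0.60, 0.40],   # somewhat uncertain
    [0.99, 0.01],   # very confident
])
batch = select_batch(probs, 2)
print(sorted(batch.tolist()))  # → [0, 2]
```

A greedy ranking like this ignores interactions between the selected points, which is precisely the gap the paper's convex-relaxation formulations address.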

Original language: English (US)
Article number: 7006697
Pages (from-to): 1945-1958
Number of pages: 14
Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence
Volume: 37
Issue number: 10
DOIs
State: Published - Oct 1 2015

Keywords

  • Batch Mode Active Learning
  • Optimization

ASJC Scopus subject areas

  • Artificial Intelligence
  • Computer Vision and Pattern Recognition
  • Software
  • Computational Theory and Mathematics
  • Applied Mathematics
  • Medicine(all)
