QUERYING EASILY FLIP-FLOPPED SAMPLES FOR DEEP ACTIVE LEARNING

  • Seong Jin Cho
  • Gwangsu Kim
  • Junghyun Lee
  • Jinwoo Shin
  • Chang D. Yoo*

*Corresponding author for this work

Research output: Conference paper › peer-review

Abstract

Active learning, a paradigm within machine learning, aims to strategically select and query unlabeled data to enhance model performance. A crucial selection strategy leverages the model's predictive uncertainty, which reflects the informativeness of a data point. While a sample's distance to the decision boundary is an intuitive measure of predictive uncertainty, its computation becomes intractable for the complex decision boundaries formed in multiclass classification tasks. This paper introduces the least disagree metric (LDM), the smallest probability of predicted label disagreement. We propose an asymptotically consistent estimator for LDM under mild assumptions. The estimator is computationally efficient and straightforward to implement for deep learning models using parameter perturbation. The LDM-based active learning algorithm queries the unlabeled data with the smallest LDM, achieving state-of-the-art overall performance across various datasets and deep architectures, as demonstrated by the experimental results.
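The abstract's estimator can be illustrated on a toy model. The sketch below, for a linear softmax classifier in NumPy, perturbs the weights with Gaussian noise at several scales; whenever a perturbed model flips the sample's predicted label, it records that model's empirical disagreement with the base model on a reference pool, and the LDM estimate is the smallest such disagreement. The noise scales, sample counts, and reference pool are illustrative assumptions, not the paper's actual estimator or hyperparameters.

```python
import numpy as np

def predict(W, X):
    # Predicted labels of a linear classifier; rows of X are samples.
    return np.argmax(X @ W.T, axis=-1)

def estimate_ldm(W, x, X_ref, sigmas=(0.05, 0.1, 0.2, 0.4),
                 n_samples=200, seed=0):
    """Monte-Carlo sketch of the least disagree metric for one sample x.

    For each noise scale sigma, draw perturbed weights W' = W + sigma * N(0, I).
    If the perturbed model flips x's predicted label, measure how often it
    disagrees with the base model on the reference pool X_ref; return the
    smallest disagreement probability observed. (Illustrative simplification,
    not the paper's estimator.)
    """
    rng = np.random.default_rng(seed)
    base_label = predict(W, x[None])[0]
    base_ref = predict(W, X_ref)
    ldm = 1.0
    for sigma in sigmas:
        for _ in range(n_samples):
            Wp = W + sigma * rng.standard_normal(W.shape)
            if predict(Wp, x[None])[0] != base_label:
                # Empirical disagreement of this "flipping" hypothesis
                # with the base model over the reference pool.
                ldm = min(ldm, float(np.mean(predict(Wp, X_ref) != base_ref)))
    return ldm
```

A query rule then labels the pool points with the smallest LDM scores, since a sample whose label a near-identical hypothesis can flip lies close to the decision boundary and is thus most informative.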

Original language: English
State: Published - 2024
Event: 12th International Conference on Learning Representations, ICLR 2024 - Hybrid, Vienna, Austria
Duration: 2024.05.07 – 2024.05.11

Conference

Conference: 12th International Conference on Learning Representations, ICLR 2024
Country/Territory: Austria
City: Hybrid, Vienna
Period: 24.05.07 – 24.05.11

Quacquarelli Symonds (QS) Subject Topics

  • Linguistics
  • Computer Science & Information Systems
  • Data Science
  • Education & Training
