Abstract
Stream-based active learning (AL) strategies minimize labeling effort by querying the labels that improve the classifier's performance the most. So far, these strategies neglect the fact that an oracle or expert requires time to provide a queried label. We show that existing AL methods deteriorate or even fail under the influence of such verification latency. The problem with these methods is that they estimate a label's utility on the currently available labeled data. However, by the time this label arrives, some of the current data may have become outdated and new labels may have arrived. In this article, we propose to simulate the data that will be available at the time the label arrives. To this end, our method, Forgetting and Simulating (FS), forgets outdated information and simulates the delayed labels to obtain more realistic utility estimates. We assume that each label's arrival date is known a priori and that the classifier's training data are bounded by a sliding window. Our extensive experiments show that FS improves stream-based AL strategies in settings with both constant and variable verification latency.
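The core idea of the abstract, forgetting window-expired samples and simulating the labels that will have arrived by a query's arrival time, can be illustrated with a minimal sketch. All names below (`Sample`, `simulate_labeled_set`, the scalar feature) are hypothetical illustrations, not the authors' implementation:

```python
from dataclasses import dataclass

@dataclass
class Sample:
    x: float          # feature value (scalar for simplicity)
    t_arrival: float  # time at which this sample's label becomes available

def simulate_labeled_set(labeled, pending, t_now, latency, window):
    """Sketch of the FS idea: estimate the labeled set available at
    t_query = t_now + latency, when a label queried now would arrive.

    - Forgetting: drop labeled samples that will have fallen out of the
      sliding window of length `window` by t_query.
    - Simulating: add pending (already queried) labels whose known
      arrival dates are no later than t_query.
    """
    t_query = t_now + latency
    kept = [s for s in labeled if s.t_arrival > t_query - window]
    arrived = [s for s in pending if s.t_arrival <= t_query]
    return kept + arrived
```

A utility estimate computed on this simulated set, rather than on the current labeled data, reflects the state of the classifier when the queried label would actually be usable.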
Original language | English |
---|---|
Pages (from-to) | 2011–2036 |
Number of pages | 26 |
Journal | Machine Learning |
Volume | 111 |
Issue number | 6 |
Publication status | Published - Jun 2022 |
Event | European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases 2022 - France. Duration: 19 Sept 2022 → 23 Sept 2022. https://2022.ecmlpkdd.org/ |
Bibliographical note
Funding Information: We would like to thank our colleagues from the Intelligent Embedded Systems group, in particular Lukas Rauch, Yujiang He, David Meier, and Alexander Benz. Furthermore, we thank the anonymous reviewers for their helpful comments and suggestions.
Publisher Copyright:
© 2021, The Author(s).
Keywords
- Active learning
- Classification
- Concept drift
- Evolving data streams
- Label delay
- Verification latency