Carpe Diem, Seize the Samples Uncertain "at the Moment" for Adaptive Batch Selection

The accuracy of deep neural networks is significantly affected by how well mini-batches are constructed during training. In this paper, we propose a novel adaptive batch selection algorithm called Recency Bias that exploits uncertain samples, i.e., those predicted inconsistently in recent iterations. The historical label predictions of each training sample within a sliding window are used to evaluate its predictive uncertainty. Then, each training sample is assigned a sampling probability for the next mini-batch in proportion to its predictive uncertainty. By virtue of this design, Recency Bias not only accelerates training but also yields a more accurate network. We demonstrate the superiority of Recency Bias through extensive evaluation on two independent tasks. Compared with existing batch selection methods, Recency Bias reduced the test error by up to 20.97% within a fixed wall-clock training time and, at the same time, reduced the training time needed to reach the same test error by up to 59.32%.
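
The abstract describes the mechanism at a high level: keep a sliding window of each sample's recent label predictions, score predictive uncertainty from that window, and sample the next mini-batch in proportion to the scores. The sketch below illustrates that idea only; the entropy-based uncertainty measure, the epsilon smoothing term, and all class/method names are illustrative assumptions rather than the authors' reference implementation.

```python
# Minimal sketch of uncertainty-proportional batch selection, assuming an
# entropy-based uncertainty score over a sliding window of predicted labels.
from collections import Counter, deque
import math
import random


class RecencyBiasSelector:
    def __init__(self, num_samples, num_classes, window_size=10, epsilon=1e-3):
        self.num_classes = num_classes
        self.epsilon = epsilon  # keeps every sample selectable (assumption)
        # One fixed-length history of predicted labels per training sample.
        self.histories = [deque(maxlen=window_size) for _ in range(num_samples)]

    def record_prediction(self, sample_idx, predicted_label):
        """Store the label predicted for a sample at the current iteration."""
        self.histories[sample_idx].append(predicted_label)

    def uncertainty(self, sample_idx):
        """Normalized entropy of the labels predicted within the window."""
        history = self.histories[sample_idx]
        if not history:
            return 1.0  # unseen samples are treated as maximally uncertain
        counts = Counter(history)
        total = len(history)
        entropy = -sum((c / total) * math.log(c / total) for c in counts.values())
        return entropy / math.log(self.num_classes)

    def sample_batch(self, batch_size):
        """Draw a mini-batch with probability proportional to uncertainty."""
        weights = [self.uncertainty(i) + self.epsilon
                   for i in range(len(self.histories))]
        return random.choices(range(len(self.histories)),
                              weights=weights, k=batch_size)
```

In a training loop, one would call record_prediction for every sample in the current mini-batch after the forward pass and sample_batch to choose the indices for the next iteration, so that inconsistently predicted samples are visited more often.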
Publisher
Association for Computing Machinery
Issue Date
2020-10-23
Language
English
Citation
29th ACM International Conference on Information and Knowledge Management (CIKM), pp. 1385-1394
DOI
10.1145/3340531.3411898
URI
http://hdl.handle.net/10203/277126
Appears in Collection
CS-Conference Papers (Conference Papers)