BWS: Best Window Selection Based on Sample Scores for Data Pruning across Broad Ranges

DC Field | Value | Language
dc.contributor.author | Choi, Hoyong | ko
dc.contributor.author | Ki, Nohyun | ko
dc.contributor.author | Chung, Hye Won | ko
dc.date.accessioned | 2024-09-28T02:00:10Z | -
dc.date.available | 2024-09-28T02:00:10Z | -
dc.date.created | 2024-09-28 | -
dc.date.issued | 2024-07-22 | -
dc.identifier.citation | 41st International Conference on Machine Learning, ICML 2024, pp. 8672-8701 | -
dc.identifier.issn | 2640-3498 | -
dc.identifier.uri | http://hdl.handle.net/10203/323295 | -
dc.description.abstract | Data subset selection aims to find a smaller yet informative subset of a large dataset that can approximate the full-dataset training, addressing challenges associated with training neural networks on large-scale datasets. However, existing methods tend to specialize in either high or low selection ratio regimes, lacking a universal approach that consistently achieves competitive performance across a broad range of selection ratios. We introduce a universal and efficient data subset selection method, Best Window Selection (BWS), by proposing a method to choose the best window subset from samples ordered based on their difficulty scores. This approach offers flexibility by allowing the choice of window intervals that span from easy to difficult samples. Furthermore, we provide an efficient mechanism for selecting the best window subset by evaluating its quality using kernel ridge regression. Our experimental results demonstrate the superior performance of BWS compared to other baselines across a broad range of selection ratios over datasets, including CIFAR-10/100 and ImageNet, and the scenarios involving training from random initialization or fine-tuning of pre-trained models. | -
dc.language | English | -
dc.publisher | ML Research Press | -
dc.title | BWS: Best Window Selection Based on Sample Scores for Data Pruning across Broad Ranges | -
dc.type | Conference | -
dc.type.rims | CONF | -
dc.citation.beginningpage | 8672 | -
dc.citation.endingpage | 8701 | -
dc.citation.publicationname | 41st International Conference on Machine Learning, ICML 2024 | -
dc.identifier.conferencecountry | AU | -
dc.contributor.localauthor | Chung, Hye Won | -
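
The abstract above describes the core of BWS: order samples by a difficulty score, treat contiguous windows of that ordering as candidate subsets, and pick the best window by scoring each candidate cheaply with kernel ridge regression. The sketch below is an illustrative reconstruction of that idea under stated assumptions, not the authors' released code: it assumes features and labels are NumPy arrays already sorted from easiest to hardest, uses scikit-learn's KernelRidge as a stand-in for the kernel ridge regression proxy, and the function names, window stride, and regularization strength are hypothetical choices.

    import numpy as np
    from sklearn.kernel_ridge import KernelRidge

    def window_score(features, labels, start, size, val_features, val_labels, alpha=1.0):
        """Fit kernel ridge regression on one window of difficulty-ordered data and
        return validation accuracy as a cheap proxy for subset quality."""
        idx = slice(start, start + size)
        n_classes = int(labels.max()) + 1
        targets = np.eye(n_classes)[labels[idx]]        # one-hot targets for regression
        model = KernelRidge(alpha=alpha, kernel="rbf")
        model.fit(features[idx], targets)
        preds = model.predict(val_features).argmax(axis=1)
        return float((preds == val_labels).mean())

    def best_window(features, labels, val_features, val_labels, size, stride=500):
        """Slide a fixed-size window over the difficulty-ordered samples and return
        the indices of the window whose proxy score is highest."""
        best_start, best_acc = 0, -1.0
        for start in range(0, len(labels) - size + 1, stride):
            acc = window_score(features, labels, start, size, val_features, val_labels)
            if acc > best_acc:
                best_start, best_acc = start, acc
        return np.arange(best_start, best_start + size)

In this sketch, the returned index range would be used to train the target network on only that window of the full dataset; the window size corresponds to the desired selection ratio.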
Appears in Collection
EE-Conference Papers(학술회의논문)
Files in This Item
There are no files associated with this item.
