Representational Continuity for Unsupervised Continual Learning

DC Field                          Value
dc.contributor.author             Madaan, Divyam
dc.contributor.author             Yoon, Jaehong
dc.contributor.author             Li, Yuanchun
dc.contributor.author             Liu, Yunxin
dc.contributor.author             Hwang, Sung Ju
dc.date.accessioned               2022-12-05T00:00:32Z
dc.date.available                 2022-12-05T00:00:32Z
dc.date.created                   2022-12-05
dc.date.issued                    2022-04-25
dc.identifier.citation            10th International Conference on Learning Representations, ICLR 2022
dc.identifier.uri                 http://hdl.handle.net/10203/301569
dc.description.abstract           Continual learning (CL) aims to learn a sequence of tasks without forgetting the previously acquired knowledge. However, recent CL advances are restricted to supervised continual learning (SCL) scenarios. Consequently, they are not scalable to real-world applications where the data distribution is often biased and unannotated. In this work, we focus on unsupervised continual learning (UCL), where we learn the feature representations on an unlabelled sequence of tasks and show that reliance on annotated data is not necessary for continual learning. We conduct a systematic study analyzing the learned feature representations and show that unsupervised visual representations are surprisingly more robust to catastrophic forgetting, consistently achieve better performance, and generalize better to out-of-distribution tasks than SCL. Furthermore, through qualitative analysis of the learned representations, we find that UCL achieves a smoother loss landscape and learns meaningful feature representations. Additionally, we propose Lifelong Unsupervised Mixup (LUMP), a simple yet effective technique that interpolates between the current task and previous tasks’ instances to alleviate catastrophic forgetting for unsupervised representations. We release our code online.
dc.language                       English
dc.publisher                      International Conference on Learning Representations, ICLR
dc.title                          Representational Continuity for Unsupervised Continual Learning
dc.type                           Conference
dc.type.rims                      CONF
dc.citation.publicationname       10th International Conference on Learning Representations, ICLR 2022
dc.identifier.conferencecountry   US
dc.identifier.conferencelocation  Virtual
dc.contributor.localauthor        Hwang, Sung Ju
dc.contributor.nonIdAuthor        Madaan, Divyam
dc.contributor.nonIdAuthor        Li, Yuanchun
dc.contributor.nonIdAuthor        Liu, Yunxin
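
The abstract above describes LUMP as a mixup-style interpolation between current-task instances and instances replayed from previous tasks. The sketch below illustrates that idea only; it is not the authors' released implementation. The toy Encoder, the SimSiam-style negative-cosine loss (without a predictor head), the Beta(α, α) mixing coefficient, the noise-based "augmentation", and the randomly-replaced replay buffer are all illustrative assumptions; consult the code released with the paper for the actual method.

```python
import random
import torch
import torch.nn.functional as F
from torch import nn


class Encoder(nn.Module):
    """Toy encoder standing in for a real backbone + projector (assumption)."""
    def __init__(self, in_dim=3 * 32 * 32, feat_dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(), nn.Linear(in_dim, 256), nn.ReLU(), nn.Linear(256, feat_dim)
        )

    def forward(self, x):
        return self.net(x)


def ssl_loss(z1, z2):
    """Placeholder unsupervised loss: negative cosine similarity with a
    stop-gradient on the second view (a simplified SimSiam-style term)."""
    return -F.cosine_similarity(z1, z2.detach(), dim=-1).mean()


def lump_step(model, optimizer, x_cur, buffer, alpha=0.4, buffer_size=256,
              augment=lambda x: x + 0.05 * torch.randn_like(x)):
    """One LUMP-style update: interpolate the current-task batch with samples
    replayed from previous tasks, then apply the unsupervised loss to two
    augmented views of the mixed batch."""
    if len(buffer) >= x_cur.size(0):
        idx = random.sample(range(len(buffer)), x_cur.size(0))
        x_buf = torch.stack([buffer[i] for i in idx])
        lam = torch.distributions.Beta(alpha, alpha).sample().item()
        x_mix = lam * x_cur + (1.0 - lam) * x_buf   # mixup interpolation
    else:
        x_mix = x_cur                               # buffer not yet filled

    z1, z2 = model(augment(x_mix)), model(augment(x_mix))
    loss = 0.5 * (ssl_loss(z1, z2) + ssl_loss(z2, z1))

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

    # Store current-task samples for later replay (random replacement here;
    # the buffer-management policy is a placeholder, not the paper's).
    for x in x_cur:
        if len(buffer) < buffer_size:
            buffer.append(x.detach())
        else:
            buffer[random.randrange(buffer_size)] = x.detach()
    return loss.item()


# Example usage on random tensors (shapes only, not a real training loop):
model = Encoder()
opt = torch.optim.SGD(model.parameters(), lr=0.03)
replay_buffer = []
for _ in range(3):
    batch = torch.randn(32, 3, 32, 32)
    print(lump_step(model, opt, batch, replay_buffer))
```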
Appears in Collection
AI-Conference Papers (conference papers)
Files in This Item
There are no files associated with this item.
