Self-distillation for further pre-training of transformers

Cited 0 times in Web of Science · Cited 0 times in Scopus
  • Hits: 63
  • Downloads: 0
DC Field | Value | Language
dc.contributor.author | Lee, Seanie | ko
dc.contributor.author | Kang, Minki | ko
dc.contributor.author | Lee, Juho | ko
dc.contributor.author | Hwang, Sung Ju | ko
dc.contributor.author | Kawaguchi, Kenji | ko
dc.date.accessioned | 2023-10-18T01:05:07Z | -
dc.date.available | 2023-10-18T01:05:07Z | -
dc.date.created | 2023-10-17 | -
dc.date.issued | 2023-05-01 | -
dc.identifier.citation | The Eleventh International Conference on Learning Representations | -
dc.identifier.uri | http://hdl.handle.net/10203/313494 | -
dc.language | English | -
dc.publisher | International Conference on Learning Representations | -
dc.title | Self-distillation for further pre-training of transformers | -
dc.type | Conference | -
dc.type.rims | CONF | -
dc.citation.publicationname | The Eleventh International Conference on Learning Representations | -
dc.identifier.conferencecountry | RW | -
dc.identifier.conferencelocation | Kigali | -
dc.contributor.localauthor | Lee, Juho | -
dc.contributor.localauthor | Hwang, Sung Ju | -
dc.contributor.nonIdAuthor | Kang, Minki | -
dc.contributor.nonIdAuthor | Kawaguchi, Kenji | -
Appears in Collection
AI-Conference Papers (Conference Papers)
Files in This Item
There are no files associated with this item.
