Co-training and Co-distillation for Quality Improvement and Compression of Language Models

DC Field | Value | Language
dc.contributor.author | Lee, Hayeon | ko
dc.contributor.author | Hou, Rui | ko
dc.contributor.author | Kim, Jongphil | ko
dc.contributor.author | Liang, Davis | ko
dc.contributor.author | Zhang, Hongbo | ko
dc.contributor.author | Hwang, Sung Ju | ko
dc.contributor.author | Min, Alexander | ko
dc.date.accessioned | 2023-12-12T10:00:25Z | -
dc.date.available | 2023-12-12T10:00:25Z | -
dc.date.created | 2023-12-09 | -
dc.date.issued | 2023-12-06 | -
dc.identifier.citation | Empirical Methods in Natural Language Processing, EMNLP 2023 | -
dc.identifier.uri | http://hdl.handle.net/10203/316309 | -
dc.language | English | -
dc.publisher | Association for Computational Linguistics | -
dc.title | Co-training and Co-distillation for Quality Improvement and Compression of Language Models | -
dc.type | Conference | -
dc.type.rims | CONF | -
dc.citation.publicationname | Empirical Methods in Natural Language Processing, EMNLP 2023 | -
dc.identifier.conferencecountry | SI | -
dc.identifier.conferencelocation | Resorts World Convention Centre | -
dc.contributor.localauthor | Hwang, Sung Ju | -
dc.contributor.nonIdAuthor | Lee, Hayeon | -
dc.contributor.nonIdAuthor | Hou, Rui | -
dc.contributor.nonIdAuthor | Kim, Jongphil | -
dc.contributor.nonIdAuthor | Liang, Davis | -
dc.contributor.nonIdAuthor | Zhang, Hongbo | -
dc.contributor.nonIdAuthor | Min, Alexander | -
Appears in Collection
AI-Conference Papers(학술대회논문)
Files in This Item
There are no files associated with this item.
