Knowledge-Augmented Reasoning Distillation for Small Language Models in Knowledge-Intensive Tasks

DC Field | Value | Language
dc.contributor.author | Kang, Minki | ko
dc.contributor.author | Lee, Seanie | ko
dc.contributor.author | Baek, Jinheon | ko
dc.contributor.author | Kawaguchi, Kenji | ko
dc.contributor.author | Hwang, Sung Ju | ko
dc.date.accessioned | 2023-12-12T10:00:15Z | -
dc.date.available | 2023-12-12T10:00:15Z | -
dc.date.created | 2023-12-09 | -
dc.date.issued | 2023-12-14 | -
dc.identifier.citation | 37th Conference on Neural Information Processing Systems, NeurIPS 2023 | -
dc.identifier.uri | http://hdl.handle.net/10203/316308 | -
dc.language | English | -
dc.publisher | Neural Information Processing Systems Foundation | -
dc.title | Knowledge-Augmented Reasoning Distillation for Small Language Models in Knowledge-Intensive Tasks | -
dc.type | Conference | -
dc.type.rims | CONF | -
dc.citation.publicationname | 37th Conference on Neural Information Processing Systems, NeurIPS 2023 | -
dc.identifier.conferencecountry | US | -
dc.identifier.conferencelocation | New Orleans Ernest N. Morial Convention Center | -
dc.contributor.localauthor | Hwang, Sung Ju | -
dc.contributor.nonIdAuthor | Kawaguchi, Kenji | -
Appears in Collection
AI-Conference Papers (Conference Papers)
Files in This Item
There are no files associated with this item.
