Weakly Supervised Pre-Training for Multi-Hop Retriever

DC Field: Value [Language]

dc.contributor.author: Seonwoo, Yeon [ko]
dc.contributor.author: Lee, Sang-Woo [ko]
dc.contributor.author: Oh, Alice Haeyun [ko]
dc.contributor.author: Ha, Jung-Woo [ko]
dc.contributor.author: Kim, Ji-Hoon [ko]
dc.date.accessioned: 2021-11-10T06:49:54Z
dc.date.available: 2021-11-10T06:49:54Z
dc.date.created: 2021-11-02
dc.date.issued: 2021-08
dc.identifier.citation: The Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (ACL-IJCNLP 2021)
dc.identifier.uri: http://hdl.handle.net/10203/289122
dc.description.abstract: In multi-hop QA, answering complex questions entails iterative document retrieval to find the missing entity of the question. The main steps of this process are sub-question detection, document retrieval for the sub-question, and generation of a new query for the final document retrieval. However, building a dataset that contains complex questions with sub-questions and their corresponding documents requires costly human annotation. To address this issue, we propose a new method for weakly supervised multi-hop retriever pre-training without human effort. Our method includes 1) a pre-training task for generating vector representations of complex questions, 2) a scalable data generation method that produces the nested structure of question and sub-question as weak supervision for pre-training, and 3) a pre-training model structure based on dense encoders. We conduct experiments comparing the performance of our pre-trained retriever with several state-of-the-art models on end-to-end multi-hop QA as well as document retrieval. The experimental results show that our pre-trained retriever is effective and also robust under limited data and computational resources.
dc.language: English
dc.publisher: Association for Computational Linguistics (ACL 2021)
dc.title: Weakly Supervised Pre-Training for Multi-Hop Retriever
dc.type: Conference
dc.type.rims: CONF
dc.citation.publicationname: The Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (ACL-IJCNLP 2021)
dc.identifier.conferencecountry: TH
dc.identifier.conferencelocation: The Berkeley Hotel, Bangkok
dc.contributor.localauthor: Oh, Alice Haeyun
dc.contributor.nonIdAuthor: Lee, Sang-Woo
dc.contributor.nonIdAuthor: Ha, Jung-Woo
dc.contributor.nonIdAuthor: Kim, Ji-Hoon
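The iterative loop described in the abstract (retrieve a document for the sub-question, then form a new query from the retrieved evidence for the next hop) can be sketched as follows. This is a hypothetical illustration: the `embed`, `score`, and `multi_hop_retrieve` names are made up here, and a toy bag-of-words encoder stands in for the paper's trained dense encoders.

```python
import math
from collections import Counter

def embed(text):
    # Toy bag-of-words vector standing in for a trained dense encoder.
    counts = Counter(text.lower().split())
    norm = math.sqrt(sum(v * v for v in counts.values())) or 1.0
    return {w: v / norm for w, v in counts.items()}

def score(q_vec, d_vec):
    # Cosine similarity between two sparse toy vectors.
    return sum(v * d_vec.get(w, 0.0) for w, v in q_vec.items())

def multi_hop_retrieve(question, corpus, hops=2):
    # One retrieval step per hop: encode the current query, take the
    # best-scoring unseen document, then build the next hop's query by
    # appending the retrieved evidence to the original question.
    query, retrieved = question, []
    for _ in range(hops):
        q_vec = embed(query)
        candidates = [d for d in corpus if d not in retrieved]
        best = max(candidates, key=lambda d: score(q_vec, embed(d)))
        retrieved.append(best)
        query = question + " " + best
    return retrieved
```

In a two-hop question such as "where is the university of Alice Oh located", the first hop retrieves the bridge document naming the missing entity (the university), and the second hop's evidence-augmented query reaches the document holding the answer.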
Appears in Collection
CS-Conference Papers (Conference Papers)
Files in This Item
There are no files associated with this item.
