Paraphrase Bidirectional Transformer with Multi-task Learning

Cited 4 times in Web of Science; cited 4 times in Scopus
Analyzing the semantic similarity of two sentences is an important problem in Natural Language Processing (NLP). This paper proposes Paraphrase-BERT for the Paraphrase Identification (PI) task. We first fine-tune pre-trained BERT on the MRPC data and adopt Whole Word Masking, a pre-training method recently announced by Google. Finally, we perform Multi-Task Learning (MTL) to further improve performance: the Question Answering (QA) task and the PI task are learned sequentially. The results show that MTL improves the performance of the downstream task (an 11.11 percentage point absolute improvement in accuracy and a 7.88 percentage point absolute improvement in F1).
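
The sequential QA-then-PI setup can be sketched as follows. This is a minimal illustration using the Hugging Face transformers library, not the authors' released code; the checkpoint name, the encoder weight hand-off, and the example sentence pair are assumptions made for illustration only.

```python
# Minimal sketch of the sequential QA -> PI fine-tuning described above,
# using Hugging Face transformers (illustrative; not the authors' code).
import torch
from transformers import (
    BertTokenizer,
    BertForQuestionAnswering,
    BertForSequenceClassification,
)

# A Whole-Word-Masking checkpoint released by Google; whether the paper
# used this exact checkpoint is an assumption here.
CHECKPOINT = "bert-large-uncased-whole-word-masking"

tokenizer = BertTokenizer.from_pretrained(CHECKPOINT)

# Step 1: fine-tune on Question Answering (training loop omitted for brevity).
qa_model = BertForQuestionAnswering.from_pretrained(CHECKPOINT)
# ... train qa_model on a QA dataset here ...

# Step 2: carry the QA-tuned encoder into a sentence-pair classifier and
# fine-tune it on MRPC for Paraphrase Identification.
pi_model = BertForSequenceClassification.from_pretrained(CHECKPOINT, num_labels=2)
# strict=False because the QA model's encoder carries no pooler weights.
pi_model.bert.load_state_dict(qa_model.bert.state_dict(), strict=False)
# ... train pi_model on MRPC sentence pairs here ...

# Inference on one MRPC-style sentence pair.
inputs = tokenizer(
    "The company reported record profits this quarter.",
    "Record quarterly profits were announced by the company.",
    return_tensors="pt",
    truncation=True,
)
with torch.no_grad():
    logits = pi_model(**inputs).logits
print("paraphrase" if logits.argmax(dim=-1).item() == 1 else "not a paraphrase")
```

In the paper's setup the QA task is learned before PI so that PI fine-tuning starts from a QA-adapted encoder; copying the shared encoder's state dict, as above, is one straightforward way to realize that hand-off.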
Publisher
IEEE, Korean Institute of Information Scientists and Engineers (KIISE)
Issue Date
2020-02-21
Language
English
Citation

2020 IEEE International Conference on Big Data and Smart Computing (BigComp 2020), pp. 217-220

ISSN
2375-933X
DOI
10.1109/bigcomp48618.2020.00-72
URI
http://hdl.handle.net/10203/277242
Appears in Collection
CS-Conference Papers
Files in This Item
There are no files associated with this item.