Fast end-to-end coreference resolution for Korean

Recently, end-to-end neural network-based approaches have shown significant improvements over traditional pipeline-based models in English coreference resolution. However, these advances came at the cost of computational complexity, and recent work has not focused on tackling this problem. Hence, in this paper, we propose BERT-SRU-based Pointer Networks that leverage the linguistic properties of head-final languages. Applying this model to Korean coreference resolution, we significantly reduce the coreference linking search space. Combining this with Ensemble Knowledge Distillation, we maintain state-of-the-art performance of 66.9% CoNLL F1 on the ETRI test set while achieving a 2x speedup (30 doc/sec) in document processing time.
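The abstract's key idea, exploiting head-final word order to shrink the antecedent search space, can be illustrated with a minimal sketch. This is an illustrative assumption, not the paper's actual model: it assumes a mention's head is its final token (the head-final property) and restricts pointer-style antecedent candidates to earlier mentions whose head tokens fall within a distance window. The function names, the `max_dist` parameter, and the data are all hypothetical.

```python
# Hypothetical sketch of head-based search-space reduction for a
# head-final language such as Korean. Mentions are (start, end) token
# spans; under head-final order, the head is assumed to be the final token.

def head_index(span):
    """Head position of a mention span, assuming head-final order:
    the head is the last token of the span."""
    start, end = span
    return end

def candidate_antecedents(mentions, i, max_dist=100):
    """Pointer-style candidate set for mention i: earlier mentions
    whose head lies within max_dist tokens of mention i's head.
    Comparing single head positions instead of full span pairs
    is what shrinks the linking search space."""
    h = head_index(mentions[i])
    return [j for j in range(i) if h - head_index(mentions[j]) <= max_dist]

mentions = [(0, 2), (5, 6), (40, 41), (120, 123)]
print(candidate_antecedents(mentions, 3, max_dist=100))  # -> [2]
```

In the actual model, a pointer network would score only these surviving candidates rather than every preceding span, which is consistent with the speedup the abstract reports.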
Publisher
Association for Computational Linguistics (ACL)
Issue Date
2020-11
Language
English
Citation

Findings of the Association for Computational Linguistics: EMNLP 2020, pp.2610 - 2624

URI
http://hdl.handle.net/10203/310347
Appears in Collection
RIMS Conference Papers
