Exploiting Numerical-Contextual Knowledge to Improve Numerical Reasoning in Question Answering

Numerical reasoning over text is a challenging subtask of question answering (QA) that requires understanding both text and numbers. However, the language models underlying existing numerical-reasoning QA systems tend to rely too heavily on pre-existing parametric knowledge at inference time, which commonly causes hallucination when interpreting numbers. Our work proposes a novel attention-masked reasoning model, NC-BERT, that learns to leverage number-related contextual knowledge to alleviate this over-reliance on parametric knowledge and enhance the numerical reasoning capabilities of the QA model. The empirical results suggest that interpreting numbers in their context, by reducing the influence of parametric knowledge and refining the numerical information in the number embeddings, leads to improved numerical reasoning accuracy and overall performance on DROP, a numerical QA dataset.
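
For illustration only, the sketch below shows one generic way an attention mask can restrict number tokens to attending over their surrounding context, in the spirit of emphasizing number-related contextual knowledge over parametric knowledge. The function name, the window parameter, and the PyTorch framing are assumptions made for this example and are not taken from the paper.

# Minimal sketch, not the paper's method: number tokens attend only to a
# local window; all other tokens attend everywhere. Names and the window
# size are hypothetical.
import torch

def make_number_context_mask(is_number: torch.Tensor, window: int = 5) -> torch.Tensor:
    """Return a boolean attention mask of shape (seq_len, seq_len).

    is_number: boolean vector marking which token positions are numbers.
    """
    seq_len = is_number.size(0)
    positions = torch.arange(seq_len)
    # near[i, j] is True when positions i and j are within `window` of each other.
    near = (positions.unsqueeze(0) - positions.unsqueeze(1)).abs() <= window
    full = torch.ones(seq_len, seq_len, dtype=torch.bool)
    # Rows for number tokens use the local window; other rows are unrestricted.
    return torch.where(is_number.unsqueeze(1), near, full)

# Example: positions 2 and 7 hold numbers in a 10-token sequence.
is_number = torch.zeros(10, dtype=torch.bool)
is_number[[2, 7]] = True
mask = make_number_context_mask(is_number, window=2)
print(mask.shape)  # torch.Size([10, 10])

Such a mask could be passed to a Transformer's attention layer so that number positions are scored against nearby context only; this is a generic attention-masking pattern, not a reconstruction of NC-BERT.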
Publisher
Association for Computational Linguistics (ACL)
Issue Date
2022-07
Language
English
Citation
The 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pp. 1811-1821
URI
http://hdl.handle.net/10203/299039
Appears in Collection
CS-Conference Papers (Conference Papers)