DC Field | Value | Language |
---|---|---|
dc.contributor.author | Thorne, James | ko |
dc.contributor.author | Vlachos, Andreas | ko |
dc.date.accessioned | 2022-12-26T07:01:42Z | - |
dc.date.available | 2022-12-26T07:01:42Z | - |
dc.date.created | 2022-12-23 | - |
dc.date.issued | 2021-04-20 | - |
dc.identifier.citation | 16th Conference of the European Chapter of the Association for Computational Linguistics, EACL 2021, pp. 957 - 964 | - |
dc.identifier.uri | http://hdl.handle.net/10203/303701 | - |
dc.description.abstract | The biases present in training datasets have been shown to affect models for sentence pair classification tasks such as natural language inference (NLI) and fact verification. While fine-tuning models on additional data has been used to mitigate them, a common issue is that of catastrophic forgetting of the original training dataset. In this paper, we show that elastic weight consolidation (EWC) allows fine-tuning of models to mitigate biases while being less susceptible to catastrophic forgetting. In our evaluation on fact verification and NLI stress tests, we show that fine-tuning with EWC dominates standard fine-tuning, yielding models with lower levels of forgetting on the original (biased) dataset for equivalent gains in accuracy on the fine-tuning (unbiased) dataset. | - |
dc.language | English | - |
dc.publisher | Association for Computational Linguistics (ACL) | - |
dc.title | Elastic weight consolidation for better bias inoculation | - |
dc.type | Conference | - |
dc.identifier.wosid | 000863557001004 | - |
dc.identifier.scopusid | 2-s2.0-85106093811 | - |
dc.type.rims | CONF | - |
dc.citation.beginningpage | 957 | - |
dc.citation.endingpage | 964 | - |
dc.citation.publicationname | 16th Conference of the European Chapter of the Association for Computational Linguistics, EACL 2021 | - |
dc.identifier.conferencecountry | UI | - |
dc.identifier.conferencelocation | Virtual | - |
dc.contributor.localauthor | Thorne, James | - |
dc.contributor.nonIdAuthor | Vlachos, Andreas | - |
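
The abstract above turns on a single technique: adding an elastic weight consolidation (EWC) penalty to the fine-tuning loss so that parameters important to the original (biased) dataset resist change. As a rough illustration only, below is a minimal PyTorch sketch of that setup; the model, data loaders, function names, and the λ and learning-rate values are assumptions made for the example, not the authors' released implementation.

```python
# Hedged sketch of EWC fine-tuning, assuming a generic PyTorch
# classifier whose forward pass maps a batch of inputs to logits.
# All names below (diagonal_fisher, finetune_with_ewc, lam, ...) are
# illustrative, not taken from the paper's code.
import torch
import torch.nn.functional as F


def diagonal_fisher(model, original_loader):
    """Empirical diagonal Fisher information, estimated from squared
    gradients of the log-likelihood on the original training data."""
    fisher = {n: torch.zeros_like(p)
              for n, p in model.named_parameters() if p.requires_grad}
    model.eval()
    for inputs, labels in original_loader:
        model.zero_grad()
        loss = F.cross_entropy(model(inputs), labels)
        loss.backward()
        for n, p in model.named_parameters():
            if p.grad is not None:
                fisher[n] += p.grad.detach() ** 2
    return {n: f / len(original_loader) for n, f in fisher.items()}


def ewc_penalty(model, fisher, old_params):
    """Quadratic penalty anchoring each parameter to its value after
    the original training, weighted by its Fisher importance."""
    penalty = 0.0
    for n, p in model.named_parameters():
        if n in fisher:
            penalty = penalty + (fisher[n] * (p - old_params[n]) ** 2).sum()
    return penalty


def finetune_with_ewc(model, finetune_loader, fisher, old_params,
                      lam=100.0, epochs=3, lr=1e-5):
    """Fine-tune on the unbiased data while penalising drift from the
    parameters learned on the original (biased) dataset."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    model.train()
    for _ in range(epochs):
        for inputs, labels in finetune_loader:
            opt.zero_grad()
            loss = F.cross_entropy(model(inputs), labels)
            loss = loss + (lam / 2.0) * ewc_penalty(model, fisher, old_params)
            loss.backward()
            opt.step()


# Typical order of operations (model and loaders assumed to exist):
#   old_params = {n: p.detach().clone() for n, p in model.named_parameters()}
#   fisher = diagonal_fisher(model, original_loader)
#   finetune_with_ewc(model, finetune_loader, fisher, old_params)
```

In this sketch, `old_params` is snapshotted and `diagonal_fisher` is run on the original training data immediately after the initial training phase, and `finetune_with_ewc` is then called on the unbiased fine-tuning data. The strength λ controls the trade-off the abstract describes: larger values mean less forgetting on the original dataset at the cost of smaller accuracy gains on the fine-tuning data.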