DC Field | Value | Language |
---|---|---|
dc.contributor.author | Moon, Woohyeon | ko |
dc.contributor.author | Kim, Taeyoung | ko |
dc.contributor.author | Park, Bumgeun | ko |
dc.contributor.author | Har, Dongsoo | ko |
dc.date.accessioned | 2023-12-20T08:00:17Z | - |
dc.date.available | 2023-12-20T08:00:17Z | - |
dc.date.created | 2023-11-30 | - |
dc.date.issued | 2023-12-02 | - |
dc.identifier.citation | The 37th Pacific Asia Conference on Language, Information and Computation, PACLIC 37 | - |
dc.identifier.uri | http://hdl.handle.net/10203/316729 | - |
dc.description.abstract | The Transformer is a state-of-the-art model in natural language processing (NLP). Current NLP models improve performance primarily by increasing the number of Transformer layers, but this approach demands substantial training resources such as computing capacity. In this paper, a novel Transformer structure is proposed, featuring full layer normalization, weighted residual connections, positional encoding exploiting reinforcement learning, and zero masked self-attention. The proposed model, called the Enhanced Transformer, is validated by the bilingual evaluation understudy (BLEU) score obtained with the Multi30k translation dataset. The Enhanced Transformer achieves a 202.96% higher BLEU score than the original Transformer on this dataset. | - |
dc.language | English | - |
dc.publisher | Pacific Asia Conference on Language, Information and Computation | - |
dc.title | Enhanced Transformer Architecture for Natural Language Processing | - |
dc.type | Conference | - |
dc.type.rims | CONF | - |
dc.citation.publicationname | The 37th Pacific Asia Conference on Language, Information and Computation, PACLIC 37 | - |
dc.identifier.conferencecountry | CC | - |
dc.identifier.conferencelocation | Hong Kong Polytechnic University | - |
dc.contributor.localauthor | Har, Dongsoo | - |
dc.contributor.nonIdAuthor | Moon, Woohyeon | - |
dc.contributor.nonIdAuthor | Kim, Taeyoung | - |
dc.contributor.nonIdAuthor | Park, Bumgeun | - |
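The abstract above names a "weighted residual connection" among the proposed components, but this record gives no implementation details. A minimal sketch of one plausible reading, assuming a scalar weight on the sublayer branch of a standard residual connection (the function names `layer_norm` and `weighted_residual` are illustrative, not from the paper), might look like:

```python
import numpy as np

def layer_norm(x, eps=1e-5):
    # Normalize the last axis to zero mean and unit variance.
    mean = x.mean(axis=-1, keepdims=True)
    std = x.std(axis=-1, keepdims=True)
    return (x - mean) / (std + eps)

def weighted_residual(x, sublayer, weight):
    # Residual connection with a weight on the sublayer branch:
    # y = x + weight * sublayer(x). With weight = 1.0 this reduces
    # to the ordinary residual connection of the original Transformer.
    return x + weight * sublayer(x)

# Toy usage on a single 3-dimensional token embedding.
x = np.array([[1.0, 2.0, 3.0]])
y = weighted_residual(x, layer_norm, weight=0.5)
```

In practice the weight would be a learnable parameter updated during training; how (or whether) the paper parameterizes it cannot be inferred from this metadata record.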