Enhanced Transformer Architecture for Natural Language Processing

DC Field | Value | Language
dc.contributor.author | Moon, Woohyeon | ko
dc.contributor.author | Kim, Taeyoung | ko
dc.contributor.author | Park, Bumgeun | ko
dc.contributor.author | Har, Dongsoo | ko
dc.date.accessioned | 2023-12-20T08:00:17Z | -
dc.date.available | 2023-12-20T08:00:17Z | -
dc.date.created | 2023-11-30 | -
dc.date.issued | 2023-12-02 | -
dc.identifier.citation | The 37th Pacific Asia Conference on Language, Information and Computation, PACLIC 37 | -
dc.identifier.uri | http://hdl.handle.net/10203/316729 | -
dc.description.abstract | The Transformer is a state-of-the-art model in natural language processing (NLP). Current NLP models improve performance primarily by stacking more Transformer layers, but this approach demands substantial training resources such as computing capacity. In this paper, a novel Transformer structure is proposed, featuring full layer normalization, weighted residual connections, positional encoding exploiting reinforcement learning, and zero masked self-attention. The proposed model, called the Enhanced Transformer, is validated by the bilingual evaluation understudy (BLEU) score obtained on the Multi30k translation dataset. The Enhanced Transformer achieves a 202.96% higher BLEU score than the original Transformer on this dataset. | -
dc.language | English | -
dc.publisher | Pacific Asia Conference on Language, Information and Computation | -
dc.title | Enhanced Transformer Architecture for Natural Language Processing | -
dc.type | Conference | -
dc.type.rims | CONF | -
dc.citation.publicationname | The 37th Pacific Asia Conference on Language, Information and Computation, PACLIC 37 | -
dc.identifier.conferencecountry | CC | -
dc.identifier.conferencelocation | Hong Kong Polytechnic University | -
dc.contributor.localauthor | Har, Dongsoo | -
dc.contributor.nonIdAuthor | Moon, Woohyeon | -
dc.contributor.nonIdAuthor | Kim, Taeyoung | -
dc.contributor.nonIdAuthor | Park, Bumgeun | -
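
The abstract above lists four architectural changes: full layer normalization, weighted residual connections, positional encoding driven by reinforcement learning, and zero masked self-attention. No files are attached to this record, so the PyTorch sketch below is only one plausible reading of the first two ideas; the learnable residual scalars, the placement of the extra LayerNorm calls, and all hyperparameters are assumptions made for illustration, not the authors' implementation.

    import torch
    import torch.nn as nn

    class WeightedResidualEncoderBlock(nn.Module):
        # One encoder block illustrating, under our assumptions, a "weighted
        # residual connection" (learnable alpha instead of plain x + f(x)) and
        # "full layer normalization" (a norm into and out of every sub-layer).
        def __init__(self, d_model: int = 512, n_heads: int = 8, d_ff: int = 2048):
            super().__init__()
            self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
            self.ff = nn.Sequential(
                nn.Linear(d_model, d_ff),
                nn.ReLU(),
                nn.Linear(d_ff, d_model),
            )
            self.norm_attn_in = nn.LayerNorm(d_model)
            self.norm_attn_out = nn.LayerNorm(d_model)
            self.norm_ff_in = nn.LayerNorm(d_model)
            self.norm_ff_out = nn.LayerNorm(d_model)
            # Residual weights; assumed to be learnable scalars here.
            self.alpha_attn = nn.Parameter(torch.ones(1))
            self.alpha_ff = nn.Parameter(torch.ones(1))

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            h = self.norm_attn_in(x)
            h, _ = self.attn(h, h, h, need_weights=False)    # self-attention sub-layer
            x = self.norm_attn_out(self.alpha_attn * x + h)  # weighted residual + norm
            h = self.ff(self.norm_ff_in(x))                  # feed-forward sub-layer
            return self.norm_ff_out(self.alpha_ff * x + h)

    block = WeightedResidualEncoderBlock()
    print(block(torch.randn(2, 16, 512)).shape)  # torch.Size([2, 16, 512])

The reported result is a corpus-level BLEU comparison on Multi30k. A minimal way to compute such a score, using the sacrebleu package (our tooling choice; the record does not name one) on a toy hypothesis/reference pair:

    import sacrebleu

    hypotheses = ["a dog is running on the grass"]  # model output, one line per sentence
    references = [["a dog runs across the grass"]]  # one list per reference set
    print(f"BLEU = {sacrebleu.corpus_bleu(hypotheses, references).score:.2f}")
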
Appears in Collection
GT-Conference Papers (학술회의논문: Conference Papers)
Files in This Item
There are no files associated with this item.
