DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | 김우창 | - |
dc.contributor.author | Choi, Seokhwan | - |
dc.contributor.author | 최석환 | - |
dc.date.accessioned | 2024-07-25T19:30:50Z | - |
dc.date.available | 2024-07-25T19:30:50Z | - |
dc.date.issued | 2023 | - |
dc.identifier.uri | http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=1045748&flag=dissertation | en_US |
dc.identifier.uri | http://hdl.handle.net/10203/320560 | - |
dc.description | 학위논문(석사) - 한국과학기술원 : 데이터사이언스대학원, 2023.8,[iii, 20 p. :] | - |
dc.description.abstract | Differentiable sorting algorithms are used in end-to-end differentiable frameworks, enabling gradient-based optimization of models that involve sorting operations. The Differentiable Sorting Network, the most recent state-of-the-art differentiable sorting algorithm, requires equal gaps between input scalars for accurate sorting. We instead treat sorting as a seq2seq generation task, where the input sequence consists of unsorted scalars and the output sequence is the argsort of those scalars. From that perspective, we present TranSort, a transformer architecture proposed as an alternative to differentiable sorting algorithms. TranSort demonstrates stable sorting performance across various distributions of input scalars, distinguishing itself from the Differentiable Sorting Network. Moreover, we present empirical evidence that end-to-end learning tasks perform better with TranSort than with previous differentiable sorting algorithms. | - |
dc.language | eng | - |
dc.publisher | 한국과학기술원 | - |
dc.subject | 미분가능한 정렬; 트랜스포머; 종단 간 학습; 시퀀스-투-시퀀스 생성 | - |
dc.subject | Differentiable sorting; Transformer; End-to-end learning; seq2seq generation | - |
dc.title | TranSort: transformer for differentiable sorting | - |
dc.title.alternative | 미분가능한 정렬을 위한 트랜스포머 | - |
dc.type | Thesis(Master) | - |
dc.identifier.CNRN | 325007 | - |
dc.description.department | 한국과학기술원 : 데이터사이언스대학원 | - |
dc.contributor.alternativeauthor | Kim, Woo Chang | - |
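The abstract frames sorting as a seq2seq generation task: the input sequence is a list of unsorted scalars and the target sequence is their argsort. As a minimal sketch of how such training pairs could be constructed (the function name `make_sort_example` is hypothetical and not from the thesis; the transformer model itself is not shown here):

```python
import random

def make_sort_example(n, dist=random.random):
    """Build one hypothetical (input, target) pair for seq2seq sorting:
    the input is a sequence of n unsorted scalars drawn from `dist`,
    and the target is their argsort (the index permutation that sorts them),
    mirroring the abstract's formulation of sorting as sequence generation."""
    xs = [dist() for _ in range(n)]
    target = sorted(range(n), key=lambda i: xs[i])  # argsort of xs
    return xs, target

# Applying the predicted permutation to the input recovers the sorted order.
xs, perm = make_sort_example(5)
print([xs[i] for i in perm] == sorted(xs))
```

A seq2seq model trained on pairs like these would emit index tokens rather than scalar values, which is one way the argsort target described in the abstract can be represented as an output vocabulary.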