(A) relevance-based transformer architecture for information retrieval

BERT, a deep bidirectional transformer architecture, builds a language model that achieved state-of-the-art performance on several general NLP tasks, mainly due to its self-attention mechanism, which captures rich, context-sensitive word and sentence semantics from a large corpus. However, a task like news search can be sensitive to changes in word meaning that arise after a major event (e.g., the meaning representation of "Las Vegas" after the shooting event). In order to capture such changes and build a search-oriented language model, we propose two different language models. Our first model applies global term weighting scaling to the embedding layer so that the new BERT model can capture more relevant relationships than before. We also propose a new relevance-based attention head method that attempts to incorporate user relevance decisions, and hence perspective changes, reflected in click-through data. Based on a series of experiments, we show that our second model captures relevance relationships between words better and gives superior retrieval performance in news retrieval.
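The abstract gives no implementation details, but the first model's core idea, scaling the embedding layer by a global term weight, can be illustrated with a minimal sketch. The sketch below assumes an IDF-style per-token weight and a BERT-like token embedding; the class name, the weight table, and the elementwise scaling scheme are illustrative assumptions, not the thesis code.

```python
# Hypothetical sketch: scale BERT-style token embeddings by a global
# term weight (e.g., IDF) so globally informative terms carry more signal.
# Names and the exact scaling scheme are assumptions, not the thesis code.
import torch
import torch.nn as nn

class IDFScaledEmbedding(nn.Module):
    def __init__(self, vocab_size: int, hidden_size: int, idf: torch.Tensor):
        super().__init__()
        self.token_emb = nn.Embedding(vocab_size, hidden_size)
        # idf: precomputed per-token global weight, shape (vocab_size,)
        self.register_buffer("idf", idf)

    def forward(self, input_ids: torch.Tensor) -> torch.Tensor:
        emb = self.token_emb(input_ids)              # (batch, seq, hidden)
        weights = self.idf[input_ids].unsqueeze(-1)  # (batch, seq, 1)
        return emb * weights                         # weight each token embedding

# Usage (stand-in IDF statistics; a real system would compute them from the corpus):
# idf = torch.rand(30522)
# layer = IDFScaledEmbedding(vocab_size=30522, hidden_size=768, idf=idf)
# out = layer(torch.randint(0, 30522, (2, 16)))
```

The second model, the relevance-based attention head trained on click-through data, would extend the attention computation itself rather than the embeddings; the abstract does not specify its form.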
Advisors
Myaeng, Sung Hyon
Description
Korea Advanced Institute of Science and Technology: School of Computing
Publisher
Korea Advanced Institute of Science and Technology (KAIST)
Issue Date
2019
Identifier
325007
Language
eng
Description

Master's thesis - Korea Advanced Institute of Science and Technology: School of Computing, 2019.8, [iv, 24 p.]

Keywords

Information retrieval; language model; natural language processing

URI
http://hdl.handle.net/10203/283080
Link
http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=875456&flag=dissertation
Appears in Collection
CS-Theses_Master (Master's theses)
Files in This Item
There are no files associated with this item.
