Hand-object pose estimation via interaction-aware graph attention mechanism

Abstract
Estimating hand and object pose from images has emerged as a promising research field due to the increasing demand for practical applications in virtual and augmented reality. The primary objective of this research is to understand the interaction between a hand and an object. To capture this intricate interaction, most existing approaches estimate meshes of the hand and the object and use them to represent hand-object interaction, such as contact regions. Recently, graph neural networks (GNNs) have been used to leverage the graph-like structure of the hand and object meshes, enabling the incorporation of spatial information during inference. However, existing GNN-based methods have not fully exploited the potential of these graphs, because they keep the connectivity (edges) within and between the hand and object classes fixed. In this thesis, we propose a graph-based refinement method that improves initially estimated hand and object meshes from an image. Our method models hand-object interaction via an interaction-aware graph attention mechanism that connects highly correlated nodes both within each intra-class graph and across the inter-class graphs. Experiments demonstrate that our proposed method improves the accuracy of hand and object pose estimation as well as aspects of hand-object interaction.
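
To make the idea of learned intra- and inter-class connectivity concrete, the following is a minimal, hypothetical sketch (not the thesis's actual architecture) of a single attention step in which hand-mesh nodes attend over both hand nodes (intra-class) and object nodes (inter-class), so that the effective graph edges are weighted by learned attention rather than kept fixed. The class name, feature dimension, and vertex counts are illustrative assumptions in a PyTorch-style implementation.

```python
# Illustrative sketch only; names, dimensions, and vertex counts are assumptions.
import torch
import torch.nn as nn

class InteractionAwareAttention(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.q = nn.Linear(dim, dim)  # query projection for hand nodes
        self.k = nn.Linear(dim, dim)  # key projection for hand + object nodes
        self.v = nn.Linear(dim, dim)  # value projection for hand + object nodes

    def forward(self, hand_feats: torch.Tensor, obj_feats: torch.Tensor) -> torch.Tensor:
        # hand_feats: (H, dim) hand-node features; obj_feats: (O, dim) object-node features
        context = torch.cat([hand_feats, obj_feats], dim=0)            # intra- + inter-class nodes
        q = self.q(hand_feats)                                         # (H, dim)
        k = self.k(context)                                            # (H+O, dim)
        v = self.v(context)                                            # (H+O, dim)
        # Attention weights act as soft, learned edges from each hand node
        # to every hand and object node.
        attn = torch.softmax(q @ k.t() / k.shape[-1] ** 0.5, dim=-1)   # (H, H+O)
        return hand_feats + attn @ v                                   # residual refinement

# Usage sketch: refine 778 hand vertices (MANO-sized mesh, assumed) against
# 1000 object vertices with an assumed 64-dimensional feature per vertex.
hand = torch.randn(778, 64)
obj = torch.randn(1000, 64)
refined_hand = InteractionAwareAttention(64)(hand, obj)
```

In this sketch the same step could be applied symmetrically to refine object nodes against hand nodes; how the thesis actually structures the intra- and inter-class graphs and their refinement passes is described in the full text linked below.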
Advisors
박진아 (researcher)
Publisher
Korea Advanced Institute of Science and Technology (KAIST)
Issue Date
2023
Identifier
325007
Language
eng
Description

Thesis (Master's) - Korea Advanced Institute of Science and Technology : School of Computing, 2023.8, [iv, 30 p.]

Keywords

Hand-object pose estimation; hand-object interaction; computer vision; graph neural network

URI
http://hdl.handle.net/10203/320720
Link
http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=1045952&flag=dissertation
Appears in Collection
CS-Theses_Master (Master's Theses)
Files in This Item
There are no files associated with this item.
