Deep motion refinement for stitched locomotion

Although motion capture technologies and manual keyframing guarantee high-quality results, it is too time consuming to create every required motion with these methods. Therefore, a few crucial animations are created and then connected with motion blending algorithms to produce the motion sequences needed for the final product. However, conventional motion blending algorithms based on interpolation struggle to produce natural locomotion in some cases. To overcome this limitation, we propose a data-driven motion refinement framework based on recurrent neural networks (RNNs). Our framework takes a naively stitched locomotion sequence as input and refines it into a natural locomotion sequence. We also design a novel data-pair generation scheme based on nearest neighbor search. After training, our model produces higher-quality results than a conventional motion blending algorithm by adjusting the steps rather than simply interpolating positions or rotations. The trained model can also refine motion sequences of various lengths, which allows it to be applied to many applications that require natural motion stitching.
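The abstract gives only a high-level description of the framework. The sketch below is a minimal, illustrative reconstruction of the two ideas it names: a GRU-based refiner that predicts per-frame corrections to a naively stitched pose sequence, and a nearest-neighbor step that pairs each stitched clip with a natural training target. It assumes PyTorch, a flat per-frame pose vector, a residual output head, and flattened L2 distance for the nearest neighbor search; the class StitchRefiner, the helper nearest_neighbor_pairs, and all layer sizes are hypothetical placeholders, not the thesis' published design.

```python
# Minimal sketch, assuming PyTorch and a per-frame pose vector representation.
# Architecture details and the exact pairing criterion are not given in the abstract.
import torch
import torch.nn as nn


class StitchRefiner(nn.Module):
    """GRU-based refiner: maps a naively stitched pose sequence to a refined one.

    The residual output head mirrors the idea of "adjusting" the input motion
    rather than re-synthesizing it; layer sizes here are placeholders.
    """

    def __init__(self, pose_dim: int = 69, hidden_dim: int = 512, num_layers: int = 2):
        super().__init__()
        self.encoder = nn.Linear(pose_dim, hidden_dim)
        self.gru = nn.GRU(hidden_dim, hidden_dim, num_layers, batch_first=True)
        self.decoder = nn.Linear(hidden_dim, pose_dim)

    def forward(self, stitched: torch.Tensor) -> torch.Tensor:
        # stitched: (batch, frames, pose_dim); sequences of different lengths
        # are handled naturally because the GRU runs frame by frame.
        h = torch.relu(self.encoder(stitched))
        h, _ = self.gru(h)
        return stitched + self.decoder(h)  # predict per-frame corrections


def nearest_neighbor_pairs(stitched_clips: torch.Tensor,
                           natural_clips: torch.Tensor) -> torch.Tensor:
    """Hypothetical pairing step: match each stitched clip (fixed frame count)
    to its nearest natural clip under flattened L2 distance to obtain a
    training target. The features actually used in the thesis are not stated here.
    """
    s = stitched_clips.flatten(1)   # (N, frames * pose_dim)
    n = natural_clips.flatten(1)    # (M, frames * pose_dim)
    dists = torch.cdist(s, n)       # (N, M) pairwise distances
    idx = dists.argmin(dim=1)       # index of the closest natural clip
    return natural_clips[idx]       # targets aligned with stitched_clips


if __name__ == "__main__":
    model = StitchRefiner()
    stitched = torch.randn(4, 120, 69)  # 4 clips, 120 frames, 69-D pose
    refined = model(stitched)
    print(refined.shape)                # torch.Size([4, 120, 69])
```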
Advisors
Noh, Junyong (노준용)
Description
Korea Advanced Institute of Science and Technology (KAIST): Graduate School of Culture Technology
Publisher
Korea Advanced Institute of Science and Technology (KAIST)
Issue Date
2022
Identifier
325007
Language
eng
Description

Thesis (Master's) - Korea Advanced Institute of Science and Technology: Graduate School of Culture Technology, 2022.8, [iii, 19 p.]

Keywords

character animation; human locomotion; deep learning; recurrent neural network; GRU; motion stitching

URI
http://hdl.handle.net/10203/308285
Link
http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=1008240&flag=dissertation
Appears in Collection
GCT-Theses_Master (Master's Theses)
Files in This Item
There are no files associated with this item.
