Walking-in-place for omnidirectional VR locomotion using a single RGB camera

Cited 4 times in Web of Science; cited 0 times in Scopus
DC Field: Value (Language)
dc.contributor.author: Kim, Woojoo (ko)
dc.contributor.author: Sung, Jaeho (ko)
dc.contributor.author: Xiong, Shuping (ko)
dc.date.accessioned: 2022-02-27T06:41:15Z
dc.date.available: 2022-02-27T06:41:15Z
dc.date.created: 2021-06-23
dc.date.issued: 2022-03
dc.identifier.citation: VIRTUAL REALITY, v.26, no.1, pp.173 - 186
dc.identifier.issn: 1359-4338
dc.identifier.uri: http://hdl.handle.net/10203/292435
dc.description.abstract: Locomotion is a fundamental interaction element that allows navigation inside the virtual environment, and walking-in-place (WIP) techniques have been actively developed as a balanced compromise between naturalness and efficiency. One popular way to implement WIP has been to use a low-cost, easy-to-set-up, and markerless Kinect, but this required integrating multiple sensors or covered only limited directions because of poor tracking capability when facing the non-frontal sides of the user. This study aimed to propose a WIP technique for omnidirectional VR locomotion based on a single RGB camera, utilizing an open-source 2D human pose estimation system called OpenPose. Three WIP techniques (the existing Kinect-based technique, the proposed Kinect-based technique, and the proposed OpenPose-based technique) were compared in terms of variation of virtual walking speed and subjective evaluation through a user study with walking tasks in different directions. Experimental results showed that the proposed OpenPose-based technique performed comparably when the user faced the front of the camera, while it induced lower variation of virtual walking speed and higher subjective evaluation ratings at non-forward directions compared to the other techniques. The proposed OpenPose-based WIP technique can be used in VR applications to provide a fully unobstructed VR locomotion experience. It achieves stable WIP-based omnidirectional VR locomotion through a single low-cost, easily accessible RGB camera, without the need for additional sensors, and at the same time both hands remain free for other interactions. (See the illustrative sketch after the metadata listing below.)
dc.language: English
dc.publisher: SPRINGER LONDON LTD
dc.title: Walking-in-place for omnidirectional VR locomotion using a single RGB camera
dc.type: Article
dc.identifier.wosid: 000663269400002
dc.identifier.scopusid: 2-s2.0-85108188980
dc.type.rims: ART
dc.citation.volume: 26
dc.citation.issue: 1
dc.citation.beginningpage: 173
dc.citation.endingpage: 186
dc.citation.publicationname: VIRTUAL REALITY
dc.identifier.doi: 10.1007/s10055-021-00551-0
dc.contributor.localauthor: Xiong, Shuping
dc.description.isOpenAccess: N
dc.type.journalArticle: Article
dc.subject.keywordAuthor: Walking in place
dc.subject.keywordAuthor: VR locomotion
dc.subject.keywordAuthor: Navigation control
dc.subject.keywordAuthor: Virtual environment
dc.subject.keywordAuthor: OpenPose
dc.subject.keywordAuthor: Kinect
dc.subject.keywordPlus: ORIENTATION
dc.subject.keywordPlus: TREADMILL
dc.subject.keywordPlus: TRAVEL
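
Illustrative sketch (not the published algorithm): the abstract describes driving virtual walking speed from a single RGB camera via OpenPose 2D keypoints. As a rough, hypothetical illustration of that general stepping-to-speed idea, the Python sketch below assumes per-frame keypoints are already available as a (25, 3) NumPy array in the BODY_25 layout and counts knee-height oscillations to estimate step cadence; the class name WipSpeedEstimator, the gain, and the thresholds are invented here for illustration only.

    # Hypothetical sketch: mapping per-frame 2D keypoints (e.g., from OpenPose,
    # BODY_25 layout assumed) to a virtual walking speed for a WIP controller.
    # Not the authors' published method; illustration only.
    from collections import deque
    import numpy as np

    R_KNEE, L_KNEE = 10, 13      # BODY_25 knee indices (assumed layout)
    CONF_MIN = 0.3               # discard frames with low-confidence keypoints

    class WipSpeedEstimator:
        def __init__(self, fps=30, window_s=1.0, step_gain=0.5):
            self.fps = fps
            self.history = deque(maxlen=int(fps * window_s))
            self.step_gain = step_gain   # m/s per (step/s); illustrative value

        def update(self, keypoints):
            """keypoints: (25, 3) NumPy array of (x, y, confidence) for one person."""
            knees = keypoints[[R_KNEE, L_KNEE]]
            if (knees[:, 2] < CONF_MIN).any():
                return 0.0               # pose unreliable -> no locomotion
            # The vertical height difference between the two knees oscillates
            # around zero while stepping in place, regardless of facing direction.
            self.history.append(knees[0, 1] - knees[1, 1])
            if len(self.history) < self.history.maxlen:
                return 0.0
            signal = np.array(self.history) - np.mean(self.history)
            # Count zero crossings as a crude cadence estimate over the window.
            crossings = np.count_nonzero(np.diff(np.sign(signal)) != 0)
            cadence = crossings / 2.0 / (len(signal) / self.fps)  # steps per second
            return self.step_gain * cadence  # virtual walking speed (m/s)

In use, the tracked person's keypoints would be fed in once per camera frame and the returned speed would translate the virtual viewpoint along the user's facing direction; smoothing and start/stop latency handling, which the paper evaluates through speed variation and subjective ratings, are omitted from this sketch.
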
Appears in Collection
IE - Journal Papers (Journal Papers)
Files in This Item
There are no files associated with this item.