OmniSense: Exploring Novel Input Sensing and Interaction Techniques on Mobile Device with an Omni-Directional Camera

DC Field | Value | Language
dc.contributor.author | Yeo, Hui-Shyong | ko
dc.contributor.author | Wu, Erwin | ko
dc.contributor.author | Kim, Daehwa | ko
dc.contributor.author | Lee, Juyoung | ko
dc.contributor.author | Kim, Hyung-il | ko
dc.contributor.author | Oh, Seo Young | ko
dc.contributor.author | Takagi, Luna | ko
dc.contributor.author | Woo, Woontack | ko
dc.contributor.author | Koike, Hideki | ko
dc.contributor.author | Quigley, Aaron John | ko
dc.date.accessioned | 2023-12-19T09:01:55Z | -
dc.date.available | 2023-12-19T09:01:55Z | -
dc.date.created | 2023-11-28 | -
dc.date.issued | 2023-04-23 | -
dc.identifier.citation | 2023 CHI Conference on Human Factors in Computing Systems, CHI 2023 | -
dc.identifier.uri | http://hdl.handle.net/10203/316696 | -
dc.description.abstract | An omni-directional (360°) camera captures the entire viewing sphere surrounding its optical center. Such cameras are growing in use to create highly immersive content and viewing experiences. When such a camera is held by a user, the view includes the user's hand grip, fingers, body pose, face, and the surrounding environment, providing a complete understanding of the visual world and the context around the device. This capability opens up numerous possibilities for rich mobile input sensing. In OmniSense, we explore the broad input design space for mobile devices with a built-in omni-directional camera and categorize it into three sensing pillars: i) near device, ii) around device, and iii) surrounding device. In addition, we explore potential use cases and applications that leverage these sensing capabilities to address user needs. Following this, we develop a working system that puts these concepts into action. We study the system in a technical evaluation and a preliminary user study to gain initial feedback and insights. Collectively, these techniques illustrate how a single, omni-purpose sensor on a mobile device affords many compelling ways to enable expressive input, while also enabling a broad range of novel applications that improve the user experience during mobile interaction. | -
dc.language | English | -
dc.publisher | ACM | -
dc.title | OmniSense: Exploring Novel Input Sensing and Interaction Techniques on Mobile Device with an Omni-Directional Camera | -
dc.type | Conference | -
dc.identifier.wosid | 001037809502003 | -
dc.identifier.scopusid | 2-s2.0-85160015632 | -
dc.type.rims | CONF | -
dc.citation.publicationname | 2023 CHI Conference on Human Factors in Computing Systems, CHI 2023 | -
dc.identifier.conferencecountry | GE | -
dc.identifier.conferencelocation | Hamburg | -
dc.identifier.doi | 10.1145/3544548.3580747 | -
dc.contributor.localauthor | Woo, Woontack | -
dc.contributor.nonIdAuthor | Yeo, Hui-Shyong | -
dc.contributor.nonIdAuthor | Wu, Erwin | -
dc.contributor.nonIdAuthor | Kim, Daehwa | -
dc.contributor.nonIdAuthor | Lee, Juyoung | -
dc.contributor.nonIdAuthor | Kim, Hyung-il | -
dc.contributor.nonIdAuthor | Takagi, Luna | -
dc.contributor.nonIdAuthor | Koike, Hideki | -
dc.contributor.nonIdAuthor | Quigley, Aaron John | -
Appears in Collection
GCT-Conference Papers (Conference Papers)
Files in This Item
There are no files associated with this item.