StickyPie: A Gaze-Based, Scale-Invariant Marking Menu Optimized for AR/VR

Cited 6 times in Web of Science; cited 0 times in Scopus
This work explores the design of marking menus for gaze-based AR/VR menu selection by expert and novice users. It first identifies and explains the challenges inherent in ocular motor control and current eye tracking hardware, including overshooting, incorrect selections, and false activations. Through three empirical studies, we optimized and validated design parameters to mitigate these errors while reducing completion time, task load, and eye fatigue. Based on the findings from these studies, we derived a set of design guidelines to support gaze-based marking menus in AR/VR. To overcome the overshoot errors found with eye-based expert marking menu behaviour, we developed StickyPie, a marking menu technique that enables scale-independent marking input by estimating saccade landing positions. An evaluation of StickyPie revealed that StickyPie was easier to learn than the traditional technique (i.e., RegularPie) and was 10% more efficient after 3 sessions.
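The abstract mentions estimating saccade landing positions to enable scale-independent marking input. The paper's actual estimator is not detailed in this record; a minimal illustrative sketch, assuming a simple "main sequence" relation (saccade amplitude roughly proportional to peak velocity, with a hypothetical constant `K`), might look like:

```python
# Hypothetical sketch of saccade landing-point prediction.
# Assumption: amplitude (deg) ~ K * peak velocity (deg/s), a simplified
# "main sequence" model. K and the function names are illustrative,
# not taken from the StickyPie paper.

K = 0.02  # assumed degrees of amplitude per deg/s of peak velocity


def predict_landing(onset, direction, peak_velocity):
    """Extrapolate a saccade's landing position from its onset point,
    unit direction vector, and the peak velocity observed so far."""
    amplitude = K * peak_velocity  # main-sequence amplitude estimate, degrees
    return (onset[0] + amplitude * direction[0],
            onset[1] + amplitude * direction[1])


# Example: saccade from (0, 0) heading right at 400 deg/s peak velocity
print(predict_landing((0.0, 0.0), (1.0, 0.0), 400.0))  # (8.0, 0.0)
```

A predictor of this kind lets a menu react before the eye lands, which is how a technique could cancel overshoot independently of menu scale.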
Publisher
ACM
Issue Date
2021-05
Language
English
Citation

CHI '21: CHI Conference on Human Factors in Computing Systems

DOI
10.1145/3411764.3445297
URI
http://hdl.handle.net/10203/312351
Appears in Collection
RIMS Conference Papers
Files in This Item
There are no files associated with this item.
