MyoKey: Inertial Motion Sensing and Gesture-Based QWERTY Keyboard for Extended Realities

Cited 1 time in Web of Science; cited 0 times in Scopus
Usability challenges and concerns about the social acceptance of text input in extended reality (XR) motivate research into novel input modalities. We investigate the fusion of inertial measurement unit (IMU) control and surface electromyography (sEMG) gesture recognition for text entry on a QWERTY-layout virtual keyboard. We design, implement, and evaluate the proposed multi-modal solution, named MyoKey, in which the user selects characters through a combination of arm movements and hand gestures. MyoKey employs a lightweight convolutional neural network classifier that can be deployed on a mobile device with negligible inference time. We demonstrate the practicality of interruption-free text entry with MyoKey by recruiting 12 participants and testing three sets of grasp micro-gestures in three scenarios: freehand text input, a tripod grasp (e.g., holding a pen), and a cylindrical grasp (e.g., holding an umbrella). With MyoKey, users achieve average text entry rates of 9.33 words per minute (WPM), 8.76 WPM, and 8.35 WPM for the freehand, tripod grasp, and cylindrical grasp conditions, respectively.
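The abstract mentions a lightweight CNN classifying sEMG gesture windows on-device, but does not spell out the architecture. As a purely illustrative sketch (not MyoKey's actual network; the channel counts, kernel size, and gesture classes below are assumptions), a minimal 1-D convolution over a multi-channel sEMG window followed by global average pooling and a linear layer could look like:

```python
import numpy as np

def conv1d_relu(x, w, b):
    """Valid 1-D convolution over time with ReLU.
    x: (in_channels, time), w: (out_channels, in_channels, kernel), b: (out_channels,)"""
    out_ch, in_ch, k = w.shape
    t_out = x.shape[1] - k + 1
    out = np.zeros((out_ch, t_out))
    for o in range(out_ch):
        for t in range(t_out):
            out[o, t] = np.sum(w[o] * x[:, t:t + k]) + b[o]
    return np.maximum(out, 0.0)  # ReLU activation

def classify_gesture(window, params):
    """Forward pass: conv -> global average pooling -> linear -> argmax class index."""
    h = conv1d_relu(window, params["w1"], params["b1"])
    pooled = h.mean(axis=1)                      # global average pooling over time
    logits = params["w2"] @ pooled + params["b2"]
    return int(np.argmax(logits))

# Example with random weights: 8 sEMG channels, 50-sample window, 5 gesture classes.
rng = np.random.default_rng(0)
params = {
    "w1": rng.standard_normal((4, 8, 5)) * 0.1,  # 4 conv filters, kernel size 5
    "b1": np.zeros(4),
    "w2": rng.standard_normal((5, 4)) * 0.1,     # 5 hypothetical gesture classes
    "b2": np.zeros(5),
}
window = rng.standard_normal((8, 50))            # one sEMG window (channels x time)
pred = classify_gesture(window, params)
```

A network of roughly this size has very few parameters, which is consistent with the paper's claim that the classifier runs on a mobile device with negligible inference time.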
Publisher
IEEE COMPUTER SOC
Issue Date
2023-08
Language
English
Article Type
Article
Citation

IEEE TRANSACTIONS ON MOBILE COMPUTING, v.22, no.8, pp.4807 - 4821

ISSN
1536-1233
DOI
10.1109/TMC.2022.3156939
URI
http://hdl.handle.net/10203/310997
Appears in Collection
IE-Journal Papers (Journal Papers)
Files in This Item
There are no files associated with this item.
