Wearable textile input device with multimodal sensing for eyes-free mobile interaction during daily activities

Cited 10 times in Web of Science; cited 0 times in Scopus
DC Field: Value (Language)
dc.contributor.author: Yoon, Sang Ho (ko)
dc.contributor.author: Huo, Ke (ko)
dc.contributor.author: Ramani, Karthik (ko)
dc.date.accessioned: 2021-09-17T01:10:06Z
dc.date.available: 2021-09-17T01:10:06Z
dc.date.created: 2021-09-17
dc.date.issued: 2016-12
dc.identifier.citation: PERVASIVE AND MOBILE COMPUTING, v.33, pp.17 - 31
dc.identifier.issn: 1574-1192
dc.identifier.uri: http://hdl.handle.net/10203/287804
dc.description.abstract: As pervasive computing becomes widely available during daily activities, wearable input devices that promote eyes-free interaction are needed for easy access and safety. We propose a textile wearable device that enables multimodal sensing input for eyes-free mobile interaction during daily activities. Although existing input devices possess multimodal sensing capabilities in a small form factor, they still suffer from deficiencies in compactness and softness due to the nature of their embedded materials and components. For our prototype, we paint conductive silicone rubber onto a single layer of textile and stitch in conductive threads. From this single textile layer, multimodal sensing values (strain and pressure) are extracted via voltage dividers (see the readout sketch after this listing). Regression analysis, multi-level thresholding, and a temporal position tracking algorithm are applied to capture the different levels and modes of finger interaction that make up the input taxonomy. We then demonstrate example applications with interaction designs that allow users to control existing mobile, wearable, and digital devices. The evaluation results confirm that the prototype achieves an accuracy of >= 80% across all input types and >= 88% in locating the specific interaction areas for eyes-free interaction, and that it remains robust during motions related to daily activities. A multitasking study reveals that our prototype promotes relatively fast responses with low perceived workload compared to existing eyes-free inputs.
dc.language: English
dc.publisher: ELSEVIER
dc.title: Wearable textile input device with multimodal sensing for eyes-free mobile interaction during daily activities
dc.type: Article
dc.identifier.wosid: 000390637300002
dc.identifier.scopusid: 2-s2.0-84966508512
dc.type.rims: ART
dc.citation.volume: 33
dc.citation.beginningpage: 17
dc.citation.endingpage: 31
dc.citation.publicationname: PERVASIVE AND MOBILE COMPUTING
dc.identifier.doi: 10.1016/j.pmcj.2016.04.008
dc.contributor.localauthor: Yoon, Sang Ho
dc.contributor.nonIdAuthor: Huo, Ke
dc.contributor.nonIdAuthor: Ramani, Karthik
dc.description.isOpenAccess: N
dc.type.journalArticle: Article
dc.subject.keywordAuthor: Wearables
dc.subject.keywordAuthor: Finger augmentation
dc.subject.keywordAuthor: Smart textile
dc.subject.keywordAuthor: Mobile interaction
dc.subject.keywordAuthor: Eyes-free input
dc.subject.keywordPlus: SENSOR
dc.subject.keywordPlus: DESIGN
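
Readout sketch. The sensing path described in the abstract (textile sensor resistance recovered through a voltage divider, then discretized by multi-level thresholding) can be illustrated with a short sketch. The following is a minimal, hypothetical Python example rather than the authors' implementation: the supply voltage, fixed resistor, ADC resolution, and threshold values are all assumed, and the paper derives its actual interaction levels via regression analysis.

# Hypothetical sketch of a voltage-divider readout with multi-level
# thresholding; all constants below are illustrative assumptions.

V_SUPPLY = 3.3       # divider supply voltage in volts (assumed)
R_FIXED = 10_000.0   # fixed divider resistor in ohms (assumed)
ADC_MAX = 1023       # full-scale count of a 10-bit ADC (assumed)

def sensor_resistance(adc_count: int) -> float:
    """Recover the textile sensor's resistance from one divider channel.

    The painted sensor is in series with R_FIXED, and the ADC samples
    the voltage across R_FIXED:
        v_out = V_SUPPLY * R_FIXED / (R_FIXED + r_sensor)
    Solving for r_sensor gives the expression below.
    """
    v_out = V_SUPPLY * adc_count / ADC_MAX
    if v_out <= 0.0:
        return float("inf")   # open circuit: no touch or strain signal
    return R_FIXED * (V_SUPPLY - v_out) / v_out

def press_level(r_sensor: float,
                thresholds=(50_000.0, 20_000.0, 8_000.0)) -> int:
    """Multi-level thresholding: map resistance to a discrete press level.

    With piezoresistive pressure sensing, harder presses lower the
    resistance, so the descending thresholds yield levels 1 (light)
    through 3 (hard), with 0 meaning no press. These threshold values
    are placeholders, not the paper's fitted levels.
    """
    level = 0
    for i, t in enumerate(thresholds, start=1):
        if r_sensor < t:
            level = i
    return level

if __name__ == "__main__":
    for count in (200, 500, 900):
        r = sensor_resistance(count)
        print(f"ADC {count}: R = {r:,.0f} ohm -> level {press_level(r)}")

Under these assumptions, strain sensing would follow the same divider readout on a separate channel, and the temporal position tracking mentioned in the abstract would operate on the time series of such discretized values.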
Appears in Collection
GCT-Journal Papers (Journal Papers)
Files in This Item
There are no files associated with this item.