A substrate-less nanomesh receptor with meta-learning for rapid hand task recognition

Cited 80 times in Web of Science; cited 0 times in Scopus
  • Hits: 680
  • Downloads: 0
DC Field | Value | Language
dc.contributor.author | Kim, Kyun Kyu | ko
dc.contributor.author | Kim, Min | ko
dc.contributor.author | Pyun, Kyungrok | ko
dc.contributor.author | Kim, Jin | ko
dc.contributor.author | Min, Jinki | ko
dc.contributor.author | Koh, Seunghun | ko
dc.contributor.author | Root, Samuel E. | ko
dc.contributor.author | Kim, Jaewon | ko
dc.contributor.author | Nguyen, Bao-Nguyen T. | ko
dc.contributor.author | Nishio, Yuya | ko
dc.contributor.author | Han, Seonggeun | ko
dc.contributor.author | Choi, Joonhwa | ko
dc.contributor.author | Kim, C-Yoon | ko
dc.contributor.author | Tok, Jeffrey B.-H. | ko
dc.contributor.author | Jo, Sung-Ho | ko
dc.contributor.author | Ko, Seung Hwan | ko
dc.contributor.author | Bao, Zhenan | ko
dc.date.accessioned | 2023-02-07T01:01:45Z | -
dc.date.available | 2023-02-07T01:01:45Z | -
dc.date.created | 2023-01-13 | -
dc.date.issued | 2023-01 | -
dc.identifier.citation | NATURE ELECTRONICS, v.6, no.1, pp.64 - 75 | -
dc.identifier.issn | 2520-1131 | -
dc.identifier.uri | http://hdl.handle.net/10203/305053 | -
dc.description.abstract | With the help of machine learning, electronic devices—including electronic gloves and electronic skins—can track the movement of human hands and perform tasks such as object and gesture recognition. However, such devices remain bulky and lack an ability to adapt to the curvature of the body. Furthermore, existing models for signal processing require large amounts of labelled data for recognizing individual tasks for every user. Here we report a substrate-less nanomesh receptor that is coupled with an unsupervised meta-learning framework and can provide user-independent, data-efficient recognition of different hand tasks. The nanomesh, which is made from biocompatible materials and can be directly printed on a person’s hand, mimics human cutaneous receptors by translating electrical resistance changes from fine skin stretches into proprioception. A single nanomesh can simultaneously measure finger movements from multiple joints, providing a simple user implementation and low computational cost. We also develop a time-dependent contrastive learning algorithm that can differentiate between different unlabelled motion signals. This meta-learned information is then used to rapidly adapt to various users and tasks, including command recognition, keyboard typing and object recognition. © 2022, The Author(s), under exclusive licence to Springer Nature Limited. | -
dc.language | English | -
dc.publisher | NATURE PUBLISHING GROUP | -
dc.title | A substrate-less nanomesh receptor with meta-learning for rapid hand task recognition | -
dc.type | Article | -
dc.identifier.wosid | 000905510900002 | -
dc.identifier.scopusid | 2-s2.0-85145091365 | -
dc.type.rims | ART | -
dc.citation.volume | 6 | -
dc.citation.issue | 1 | -
dc.citation.beginningpage | 64 | -
dc.citation.endingpage | 75 | -
dc.citation.publicationname | NATURE ELECTRONICS | -
dc.identifier.doi | 10.1038/s41928-022-00888-7 | -
dc.contributor.localauthor | Jo, Sung-Ho | -
dc.contributor.nonIdAuthor | Kim, Kyun Kyu | -
dc.contributor.nonIdAuthor | Pyun, Kyungrok | -
dc.contributor.nonIdAuthor | Kim, Jin | -
dc.contributor.nonIdAuthor | Min, Jinki | -
dc.contributor.nonIdAuthor | Root, Samuel E. | -
dc.contributor.nonIdAuthor | Kim, Jaewon | -
dc.contributor.nonIdAuthor | Nguyen, Bao-Nguyen T. | -
dc.contributor.nonIdAuthor | Nishio, Yuya | -
dc.contributor.nonIdAuthor | Han, Seonggeun | -
dc.contributor.nonIdAuthor | Choi, Joonhwa | -
dc.contributor.nonIdAuthor | Kim, C-Yoon | -
dc.contributor.nonIdAuthor | Tok, Jeffrey B.-H. | -
dc.contributor.nonIdAuthor | Ko, Seung Hwan | -
dc.contributor.nonIdAuthor | Bao, Zhenan | -
dc.description.isOpenAccess | N | -
dc.type.journalArticle | Article | -
dc.subject.keywordPlus | ELECTRONICS | -
dc.subject.keywordPlus | LIGHTWEIGHT | -
dc.subject.keywordPlus | MOVEMENTS | -
dc.subject.keywordPlus | MOTION | -
dc.subject.keywordPlus | SENSOR | -
dc.subject.keywordPlus | SKIN | -
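The abstract above mentions a time-dependent contrastive learning algorithm that differentiates between unlabelled motion signals. The sketch below illustrates only the general idea, not the authors' implementation: temporally adjacent windows of a multi-channel resistance-like signal are treated as positive pairs, all other windows in the batch as negatives, under an InfoNCE-style loss with cosine similarity. The function names, window size, channel count and synthetic signal are all invented for illustration.

```python
# Minimal, illustrative sketch of a time-dependent contrastive objective on
# unlabelled motion-like signals (NOT the paper's algorithm). Positives are
# temporally adjacent windows; negatives are all other windows in the batch.
import numpy as np

rng = np.random.default_rng(0)

def make_windows(signal, win=32, stride=8):
    """Slice a (time, channels) signal into overlapping, flattened windows."""
    starts = range(0, signal.shape[0] - win + 1, stride)
    return np.stack([signal[s:s + win].ravel() for s in starts])

def cosine_sim(a, b):
    """Pairwise cosine similarity between rows of a and rows of b."""
    a = a / (np.linalg.norm(a, axis=-1, keepdims=True) + 1e-8)
    b = b / (np.linalg.norm(b, axis=-1, keepdims=True) + 1e-8)
    return a @ b.T

def time_contrastive_loss(windows, temperature=0.1):
    """InfoNCE-style loss: each window's positive is its immediate temporal
    neighbour; every other window in the batch serves as a negative."""
    anchors, positives = windows[:-1], windows[1:]
    logits = cosine_sim(anchors, positives) / temperature   # (N-1, N-1)
    idx = np.arange(len(anchors))                           # positives lie on the diagonal
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -log_probs[idx, idx].mean()

# Synthetic stand-in for a few hypothetical nanomesh channels: slow joint-like
# oscillations at different rates plus measurement noise.
t = np.linspace(0, 10, 1000)[:, None]
signal = np.sin(t * np.array([1.0, 1.7, 2.3])) + 0.05 * rng.standard_normal((1000, 3))

windows = make_windows(signal)
print("loss on structured signal:", time_contrastive_loss(windows))
print("loss on shuffled windows: ", time_contrastive_loss(rng.permutation(windows)))
```

On the temporally ordered windows the loss should generally be lower than on a shuffled copy, which is the point of the exercise: the objective rewards representations that track temporal structure in the raw, unlabelled signal rather than any task label.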
Appears in Collection
CS-Journal Papers (저널논문)
Files in This Item
There are no files associated with this item.
