Online Social Touch Pattern Recognition with Multi-modal-sensing Modular Tactile Interface

Cited 1 time in Web of Science; cited 0 times in Scopus.
DC Field | Value | Language
dc.contributor.author | Ku, Hyun Jin | ko
dc.contributor.author | Choi, Jason J. | ko
dc.contributor.author | Jang, Sunho | ko
dc.contributor.author | Do, Wonkyung | ko
dc.contributor.author | Lee, Soomin | ko
dc.contributor.author | Seok, Sangok | ko
dc.date.accessioned | 2020-06-29T07:20:35Z | -
dc.date.available | 2020-06-29T07:20:35Z | -
dc.date.created | 2020-06-17 | -
dc.date.issued | 2019-06 | -
dc.identifier.citation | 16th International Conference on Ubiquitous Robots (UR), pp.271 - 277 | -
dc.identifier.issn | 2325-033X | -
dc.identifier.uri | http://hdl.handle.net/10203/274978 | -
dc.description.abstract | The capability to recognize various social touch patterns is necessary for robots that rely on touch-based social interaction, which is effective in many robot applications. Prior literature has focused on the novelty of the recognition system or on improving classification accuracy on publicly available datasets. In this paper, we propose an integrated framework for implementing a social touch recognition system on various robots, built on three complementary principles: 1) multi-modal tactile sensing, 2) a modular design, and 3) a social touch pattern classifier capable of learning temporal features. The approach is evaluated on an implemented Multi-modal-sensing Modular Tactile Interface prototype, and three learning methods (HMM, LSTM, and 3D-CNN) are tested as classifiers. The trained classifiers, which can run online on a robot's embedded system, predict 18 classes of social touch patterns. Results of the online validation test show that all three methods are promising, with a best accuracy of 88.86%. In particular, the stable performance of the 3D-CNN indicates that learning 'spatiotemporal' features from tactile data is more effective. Through this validation process, we confirm that our framework can be easily adopted and secures robust performance for social touch pattern recognition. | -
dc.language | English | -
dc.publisher | IEEE | -
dc.title | Online Social Touch Pattern Recognition with Multi-modal-sensing Modular Tactile Interface | -
dc.type | Conference | -
dc.identifier.wosid | 000493109000049 | -
dc.identifier.scopusid | 2-s2.0-85070566136 | -
dc.type.rims | CONF | -
dc.citation.beginningpage | 271 | -
dc.citation.endingpage | 277 | -
dc.citation.publicationname | 16th International Conference on Ubiquitous Robots (UR) | -
dc.identifier.conferencecountry | US | -
dc.identifier.conferencelocation | SOUTH KOREA | -
dc.identifier.doi | 10.1109/URAI.2019.8768706 | -
dc.contributor.nonIdAuthor | Choi, Jason J. | -
dc.contributor.nonIdAuthor | Jang, Sunho | -
dc.contributor.nonIdAuthor | Do, Wonkyung | -
dc.contributor.nonIdAuthor | Lee, Soomin | -
dc.contributor.nonIdAuthor | Seok, Sangok | -
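
Note on the abstract's third principle (a classifier that learns temporal, and in the 3D-CNN case spatiotemporal, features): this record contains no code, so the sketch below is only a minimal illustration of how a 3D-CNN could classify a sliding window of multi-modal tactile frames. The 8x8 taxel grid, 16-frame window, two input modalities, all layer sizes, and the use of PyTorch are assumptions for illustration, not the architecture reported in the paper.

# Hypothetical sketch of a spatiotemporal (3D-CNN) social touch classifier.
# Grid size (8x8 taxels), window length (16 frames), channel counts, and the
# two-modality input are illustrative assumptions, not the paper's design.
import torch
import torch.nn as nn

NUM_CLASSES = 18         # the paper reports 18 social touch pattern classes
MODALITIES = 2           # assumed: e.g. a pressure channel and a proximity channel
FRAMES, H, W = 16, 8, 8  # assumed window of 16 tactile frames on an 8x8 grid

class TouchCNN3D(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(MODALITIES, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool3d(2),   # halves time and space: -> (16, 8, 4, 4)
            nn.Conv3d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool3d(2),   # -> (32, 4, 2, 2)
        )
        self.classifier = nn.Linear(32 * 4 * 2 * 2, NUM_CLASSES)

    def forward(self, x):
        # x: (batch, modalities, frames, height, width)
        f = self.features(x)
        return self.classifier(f.flatten(1))

if __name__ == "__main__":
    model = TouchCNN3D()
    window = torch.randn(1, MODALITIES, FRAMES, H, W)  # one sliding window
    logits = model(window)
    print(logits.argmax(dim=1))  # predicted touch-pattern class index

In an online setting such as the one the abstract describes, incoming sensor frames would be appended to a ring buffer and the most recent window classified at each step; the paper reports a best online accuracy of 88.86% across the HMM, LSTM, and 3D-CNN variants.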