Sound design for emotion and intention expression of socially interactive robots

Cited 31 times in Web of Science; cited 0 times in Scopus
  • Hits: 346
  • Downloads: 0
DC Field: Value (Language)
dc.contributor.author: Jee, Eun Sook (ko)
dc.contributor.author: Jeong, Yong-Jeon (ko)
dc.contributor.author: Kim, Chong Hui (ko)
dc.contributor.author: Kobayashi, Hisato (ko)
dc.date.accessioned: 2013-03-09T00:32:36Z
dc.date.available: 2013-03-09T00:32:36Z
dc.date.created: 2012-02-06
dc.date.issued: 2010-07
dc.identifier.citation: INTELLIGENT SERVICE ROBOTICS, v.3, no.3, pp.199 - 206
dc.identifier.issn: 1861-2776
dc.identifier.uri: http://hdl.handle.net/10203/94821
dc.description.abstract: The current concept of robots has been greatly influenced by the image of robots from science fiction. Since robots were introduced into human society as partners, the importance of human-robot interaction has grown. In this paper, we designed seven musical sounds for the English teacher robot Silbot: five that express intentions and two that express emotions. To identify the sound design considerations, we analyzed the sounds of the robots R2-D2 and Wall-E from two popular movies, Star Wars and Wall-E, respectively. From this analysis, we found that intonation, pitch, and timbre are the dominant musical parameters for expressing intention and emotion. To check the validity of the designed sounds, we performed a recognition-rate experiment, which showed that the five sounds designed for intentions and the two designed for emotions are sufficient to deliver the intended intentions and emotions.
dc.language: English
dc.publisher: SPRINGER SCIENCE + BUSINESS MEDIA
dc.title: Sound design for emotion and intention expression of socially interactive robots
dc.type: Article
dc.identifier.wosid: 000510890000007
dc.identifier.scopusid: 2-s2.0-77953914365
dc.type.rims: ART
dc.citation.volume: 3
dc.citation.issue: 3
dc.citation.beginningpage: 199
dc.citation.endingpage: 206
dc.citation.publicationname: INTELLIGENT SERVICE ROBOTICS
dc.identifier.doi: 10.1007/s11370-010-0070-7
dc.contributor.nonIdAuthor: Jee, Eun Sook
dc.contributor.nonIdAuthor: Jeong, Yong-Jeon
dc.contributor.nonIdAuthor: Kobayashi, Hisato
dc.description.isOpenAccess: N
dc.type.journalArticle: Article
dc.subject.keywordAuthor: Emotion expression
dc.subject.keywordAuthor: Human-robot interaction
dc.subject.keywordAuthor: Musical sound
dc.subject.keywordAuthor: Robot
Files in This Item: There are no files associated with this item.