How do you feel online? Exploiting smartphone sensors to detect transitory emotions during social media use

Cited 0 times in Web of Science · Cited 0 times in Scopus
DC Field | Value | Language
dc.contributor.author | Ruensuk, M | ko
dc.contributor.author | Cheon, E | ko
dc.contributor.author | Hong, Hwajung | ko
dc.contributor.author | Oakley, Ian Roland | ko
dc.date.accessioned | 2021-08-04T08:10:03Z | -
dc.date.available | 2021-08-04T08:10:03Z | -
dc.date.created | 2021-08-04 | -
dc.date.issued | 2020-12 | -
dc.identifier.citation | PROCEEDINGS OF THE ACM ON INTERACTIVE MOBILE WEARABLE AND UBIQUITOUS TECHNOLOGIES-IMWUT, v.4, no.4 | -
dc.identifier.issn | 2474-9567 | -
dc.identifier.uri | http://hdl.handle.net/10203/287034 | -
dc.description.abstract | Emotions are an intrinsic part of the social media user experience that can evoke negative behaviors such as cyberbullying and trolling. Detecting the emotions of social media users may enable responding to and mitigating these problems. Prior work suggests this may be achievable on smartphones: emotions can be detected via built-in sensors during prolonged input tasks. We extend these ideas to a social media context featuring sparse input interleaved with more passive browsing and media consumption activities. To achieve this, we present two studies. In the first, we elicit participants' emotions using images and videos and capture sensor data from a mobile device, including data from a novel passive sensor: its built-in eye-tracker. Using this data, we construct machine learning models that predict self-reported binary affect, achieving 93.20% peak accuracy. A follow-up study extends these results to a more ecologically valid scenario in which participants browse their social media feeds; this study yields high accuracies for both self-reported binary valence (94.16%) and arousal (92.28%). We present a discussion of the sensors, features, and study design choices that contribute to this high performance and that future designers and researchers can use to create effective and accurate smartphone-based affect detection systems. | -
dc.language | English | -
dc.publisher | ASSOC COMPUTING MACHINERY | -
dc.title | How do you feel online? Exploiting smartphone sensors to detect transitory emotions during social media use | -
dc.type | Article | -
dc.identifier.scopusid | 2-s2.0-85098160008 | -
dc.type.rims | ART | -
dc.citation.volume | 4 | -
dc.citation.issue | 4 | -
dc.citation.publicationname | PROCEEDINGS OF THE ACM ON INTERACTIVE MOBILE WEARABLE AND UBIQUITOUS TECHNOLOGIES-IMWUT | -
dc.identifier.doi | 10.1145/3432223 | -
dc.contributor.localauthor | Hong, Hwajung | -
dc.contributor.localauthor | Oakley, Ian Roland | -
dc.contributor.nonIdAuthor | Ruensuk, M | -
dc.contributor.nonIdAuthor | Cheon, E | -
dc.description.isOpenAccess | N | -
dc.type.journalArticle | Article | -
dc.subject.keywordAuthor | affective computing | -
dc.subject.keywordAuthor | emotion detection | -
dc.subject.keywordAuthor | classification | -
dc.subject.keywordAuthor | smartphones | -
dc.subject.keywordAuthor | social media | -
Appears in Collections
ID-Journal Papers (Journal Papers); EE-Journal Papers (Journal Papers)
Files in This Item
There are no files associated with this item.
