Recognizing affect in human touch of a robot

dc.contributor.author: Altun, Kerem
dc.contributor.author: MacLean, Karon E.
dc.date.accessioned: 2021-05-15T12:37:34Z
dc.date.available: 2021-05-15T12:37:34Z
dc.date.issued: 2015
dc.department: Faculty of Engineering and Natural Sciences, Department of Mechanical Engineering
dc.description: 1st International Workshop on Multimodal Pattern Recognition of Social Signals in Human Computer Interaction (MPRSS) -- NOV 11, 2012 -- Tsukuba, JAPAN
dc.description: Altun, Kerem/0000-0002-5493-8921
dc.description.abstract: A pet cat or dog's ability to respond to our emotional state opens an interaction channel with high visceral impact, which social robots may also be able to access. Touch is a key but understudied element; here, we explore its emotional content in the context of a furry robot pet. We asked participants to imagine feeling nine emotions located in a 2-D arousal-valence affect space, then to express them by touching a lap-sized robot prototype equipped with pressure sensors and an accelerometer. We found overall correct classification (Random Forests) within the 2-D grid of 36% (all participants combined) and 48% (average of participants classified individually); chance 11%. Rates rose to 56% in the high arousal zone. To better understand classifier performance, we defined and analyzed new metrics that better indicate closeness of the gestural expressions. We also present a method to combine direct affect recognition with affect inferred from gesture recognition. This analysis provides a unique first insight into the nature and quality of affective touch, with implications as a design tool and for incorporating unintrusive affect sensing into deployed interactions. (C) 2014 Elsevier B.V. All rights reserved.
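The classification setup named in the abstract (a Random Forest distinguishing nine emotion classes on a 2-D arousal-valence grid, against an 11% chance rate) can be sketched roughly as follows. This is an illustrative reconstruction only: the synthetic feature matrix, class count, and all parameter values below are assumptions standing in for the paper's actual pressure-sensor and accelerometer features, not the authors' code or data.

```python
# Illustrative sketch of a 9-class Random Forest classification, as described
# in the abstract. The data here is synthetic: each "gesture" is a random
# feature vector with a weak class-dependent offset standing in for the
# paper's pressure/accelerometer statistics.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, n_features, n_classes = 450, 20, 9  # 9 emotions on a 3x3 grid

# Synthetic stand-in features: Gaussian noise plus a small per-class shift
y = rng.integers(0, n_classes, size=n_samples)
X = rng.normal(size=(n_samples, n_features)) + 0.3 * y[:, None]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0, stratify=y)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
acc = accuracy_score(y_te, clf.predict(X_te))
print(f"accuracy: {acc:.2f} (chance ~ {1 / n_classes:.2f})")
```

The paper's reported rates (36% pooled, 48% per-participant, 56% in the high-arousal zone) refer to its real touch-gesture features; the synthetic accuracy printed here is only meant to show the classifier-versus-chance comparison.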
dc.description.sponsorship: Int Assoc Pattern Recognit (IAPR), TC3
dc.description.sponsorship: Natural Sciences and Engineering Research Council of Canada (NSERC)
dc.description.sponsorship: We thank our experiment participants, and Dr. M. Sedlmair for visualization advice. This work was supported by NSERC.
dc.identifier.doi: 10.1016/j.patrec.2014.10.016
dc.identifier.endpage: 40
dc.identifier.issn: 0167-8655
dc.identifier.issn: 1872-7344
dc.identifier.scopus: 2-s2.0-84943148161
dc.identifier.scopusquality: Q1
dc.identifier.startpage: 31
dc.identifier.uri: https://doi.org/10.1016/j.patrec.2014.10.016
dc.identifier.uri: https://hdl.handle.net/20.500.12939/556
dc.identifier.volume: 66
dc.identifier.wos: WOS:000362271100005
dc.identifier.wosquality: Q2
dc.indekslendigikaynak: Web of Science
dc.indekslendigikaynak: Scopus
dc.institutionauthor: Altun, Kerem
dc.language.iso: en
dc.publisher: Elsevier
dc.relation.ispartof: Pattern Recognition Letters
dc.relation.publicationcategory: Conference Item - International - Institutional Faculty Member
dc.rights: info:eu-repo/semantics/openAccess
dc.subject: Affective interfaces
dc.subject: Haptic
dc.subject: Human Robot Interaction
dc.subject: Affect Recognition
dc.subject: Gesture Recognition
dc.title: Recognizing affect in human touch of a robot
dc.type: Conference Object

Files

Original bundle
Name: KEREM.pdf
Size: 1.79 MB
Format: Adobe Portable Document Format
Description: Full Text