Title: Recognizing affect in human touch of a robot
Authors: Altun, Kerem; MacLean, Karon E.
ORCID: Altun, Kerem / 0000-0002-5493-8921
Type: Conference Object
Conference: 1st International Workshop on Multimodal Pattern Recognition of Social Signals in Human Computer Interaction (MPRSS), Nov 11, 2012, Tsukuba, Japan
Date Issued: 2015
Date Accessioned: 2021-05-15
Date Available: 2021-05-15
ISSN: 0167-8655
eISSN: 1872-7344
DOI: https://doi.org/10.1016/j.patrec.2014.10.016
URI: https://hdl.handle.net/20.500.12939/556
Volume: 66
Pages: 31-40
Scopus ID: 2-s2.0-84943148161 (Q1)
Web of Science ID: WOS:000362271100005 (Q2)
Language: en
Rights: info:eu-repo/semantics/openAccess
Keywords: Affective interfaces; Haptic; Human Robot Interaction; Affect Recognition; Gesture Recognition

Abstract: A pet cat or dog's ability to respond to our emotional state opens an interaction channel with high visceral impact, which social robots may also be able to access. Touch is a key but understudied element; here, we explore its emotional content in the context of a furry robot pet. We asked participants to imagine feeling nine emotions located in a 2-D arousal-valence affect space, then to express them by touching a lap-sized robot prototype equipped with pressure sensors and an accelerometer. We found overall correct classification (Random Forests) within the 2-D grid of 36% (all participants combined) and 48% (average of participants classified individually); chance was 11%. Rates rose to 56% in the high-arousal zone. To better understand classifier performance, we defined and analyzed new metrics that better indicate the closeness of gestural expressions. We also present a method to combine direct affect recognition with affect inferred from gesture recognition. This analysis provides a unique first insight into the nature and quality of affective touch, with implications as a design tool and for incorporating unintrusive affect sensing into deployed interactions. (C) 2014 Elsevier B.V. All rights reserved.