Listing items by author "MacLean, Karon E."
Now showing 1 - 2 of 2
Item
Real-time gait classification for persuasive smartphone apps: Structuring the literature and pushing the limits (2013)
Schneider, Oliver; MacLean, Karon E.; Altun, Kerem; Karuei, Idin; Wu, Michael Ming-An

Persuasive technology is now mobile and context-aware. Intelligent analysis of accelerometer signals in smartphones and other specialized devices has recently been used to classify activity (e.g., distinguishing walking from cycling) to encourage physical activity, sustainable transport, and other social goals. Unfortunately, results vary drastically due to differences in methodology and problem domain. The present report begins by structuring a survey of current work within a new framework that highlights comparable characteristics between studies; this provides a tool by which we and others can understand the current state of the art and guide research toward existing gaps. We then present a new user study, positioned in an identified gap, that pushes the limits of current success with a challenging problem: the real-time classification of 15 similar and novel gaits suitable for several persuasive application areas, focused on the growing phenomenon of exercise games. We achieve a mean correct classification rate of 78.1% over all 15 gaits, with a minimal amount of personalized classifier training for each participant, when the device is carried in any of 6 different locations (not known a priori). When narrowed to a subset of four gaits and one known location, this improves to means of 92.2% with and 87.2% without personalization. Finally, we group our findings into design guidelines and quantify the variation in accuracy when an algorithm is trained for a known location and participant.
Copyright © 2013 ACM.

Item
Recognizing affect in human touch of a robot (Elsevier, 2015)
Altun, Kerem; MacLean, Karon E.

A pet cat or dog's ability to respond to our emotional state opens an interaction channel with high visceral impact, one that social robots may also be able to access. Touch is a key but understudied element; here, we explore its emotional content in the context of a furry robot pet. We asked participants to imagine feeling nine emotions located in a 2-D arousal-valence affect space, then to express them by touching a lap-sized robot prototype equipped with pressure sensors and an accelerometer. We found overall correct classification (Random Forests) within the 2-D grid of 36% (all participants combined) and 48% (average of participants classified individually); chance was 11%. Rates rose to 56% in the high-arousal zone. To better understand classifier performance, we defined and analyzed new metrics that better indicate the closeness of the gestural expressions. We also present a method to combine direct affect recognition with affect inferred from gesture recognition. This analysis provides a unique first insight into the nature and quality of affective touch, with implications as a design tool and for incorporating unintrusive affect sensing into deployed interactions. (C) 2014 Elsevier B.V. All rights reserved.