Browsing by Author "Al-Azawi, Saad"
Now showing 1 - 2 of 2
Item: A systematic mapping study on touch classification (Int Journal Computer Science & Network Security-Ijcsns, 2018) Fleh, Saad Q.; Bayat, Oğuz; Al-Azawi, Saad; Uçan, Osman Nuri
Touch is one of the basic interpersonal means of communicating emotion. Social touch classification is a leading research area with great potential for further improvement, and it can be beneficial in many scientific applications such as robotics and human-robot interaction. Each person can interact with the environment and with other people via touch sensors spread over the human body. These sensors provide important information about objects, such as their size, shape, position, surface, and movement; the touch system therefore plays a central role in human life from its earliest days. A small gesture can express strong emotion, from the comforting experience of being touched by one's spouse to the discomfort caused by a touch from a stranger. This paper presents a systematic mapping study on social touch gesture recognition. In total, 938 papers were collected from various digital libraries. After applying three filters, 49 papers were selected as primary studies related to the main topic, as listed in Appendix A. The selected papers are classified with respect to several facets. The results provide an overview of the relevant studies reported in the literature and highlight the focus areas and research gaps.

Item: Social touch gesture recognition using convolutional neural network (Hindawi Ltd, 2018) Albawi, Saad; Bayat, Oğuz; Al-Azawi, Saad; Uçan, Osman Nuri
Recently, social touch gesture recognition has been considered an important topic for the touch modality, which can lead to highly efficient and realistic human-robot interaction. In this paper, a deep convolutional neural network is selected to implement a social touch recognition system from raw input samples (sensor data) only.
Touch gesture recognition is performed using a dataset previously recorded from numerous subjects performing varying social gestures. This dataset, dubbed the corpus of social touch, was collected with touches performed on a mannequin arm. A leave-one-subject-out cross-validation method is used to evaluate system performance. The proposed method can recognize gestures in nearly real time after acquiring a minimum number of frames (on average, 0.2% to 4.19% of the original frame lengths) with a classification accuracy of 63.7%. The achieved accuracy is competitive with existing algorithms, and the proposed system outperforms other classification algorithms in terms of classification ratio and touch recognition time without data preprocessing on the same dataset.
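The leave-one-subject-out (LOSO) evaluation described above can be sketched as follows. This is a minimal illustration, not the paper's pipeline: the toy samples, subject IDs, gesture labels, and the trivial majority-vote "classifier" are all hypothetical placeholders standing in for the actual sensor frames and the CNN.

```python
# Minimal sketch of leave-one-subject-out (LOSO) cross-validation.
# All data and the majority-vote stand-in model are hypothetical;
# the paper trains a CNN on raw sensor frames instead.
from collections import Counter


def loso_splits(subject_ids):
    """Yield (train_idx, test_idx) pairs, holding out one subject at a time."""
    for held_out in sorted(set(subject_ids)):
        test = [i for i, s in enumerate(subject_ids) if s == held_out]
        train = [i for i, s in enumerate(subject_ids) if s != held_out]
        yield train, test


def majority_label(labels):
    """Stand-in 'model': predict the most common training label."""
    return Counter(labels).most_common(1)[0][0]


# Toy dataset: six samples from three subjects with gesture labels.
subject_ids = ["s1", "s1", "s2", "s2", "s3", "s3"]
labels = ["pat", "pat", "pat", "poke", "pat", "poke"]

accuracies = []
for train, test in loso_splits(subject_ids):
    pred = majority_label([labels[i] for i in train])
    correct = sum(1 for i in test if labels[i] == pred)
    accuracies.append(correct / len(test))

print(sum(accuracies) / len(accuracies))  # mean accuracy over held-out subjects
```

The point of LOSO here is that every gesture from the held-out subject is unseen during training, so the reported accuracy reflects generalization to new people rather than memorization of a subject's touch style.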