Kinematic-based classification of social gestures and grasping by humans and machine learning techniques
2021 (English). In: Frontiers in Robotics and AI, E-ISSN 2296-9144, Vol. 8, no. 308, p. 1-17, article id 699505. Article in journal (Refereed). Published.
Abstract [en]
The affective motion of humans conveys messages that other humans perceive and understand without conventional linguistic processing. The ability to classify human movement into meaningful gestures or segments also plays a critical role in creating social interaction between humans and robots. In the research presented here, grasping and social gesture recognition by humans and four machine learning techniques (k-Nearest Neighbor, Locality-Sensitive Hashing Forest, Random Forest and Support Vector Machine) is assessed by using human classification data as a reference for evaluating the classification performance of the machine learning techniques on thirty hand/arm gestures. In one task, the gestures are rated according to the extent of grasping motion; in another task, the same gestures are rated according to the extent to which they are perceived as social. The results indicate that humans clearly rate the gestures differently according to the two tasks. The machine learning techniques provide a similar classification of the actions according to grasping kinematics and social quality. Furthermore, there is a strong association between gesture kinematics and judgments of the grasping and social quality of the hand/arm gestures. Our results support previous research on intention-from-movement understanding, demonstrating the reliance on kinematic information for perceiving the social aspects and intentions in different grasping actions as well as in communicative point-light actions.
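The evaluation setup described above can be sketched with scikit-learn. This is a minimal illustration, not the authors' pipeline: the feature dimensions, sample counts, and labels below are synthetic stand-ins for the paper's kinematic gesture data, and Locality-Sensitive Hashing Forest is omitted because it was removed from scikit-learn.

```python
# Hypothetical sketch: comparing three of the paper's classifier families
# (kNN, Random Forest, SVM) on synthetic "kinematic feature" vectors.
# All data and dimensions here are illustrative assumptions, not the study's.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_trials, n_features = 120, 8            # assumed: per-trial kinematic summaries
X = rng.normal(size=(n_trials, n_features))   # e.g. wrist velocity, grip aperture
y = rng.integers(0, 2, size=n_trials)         # toy labels: 0 = non-grasping, 1 = grasping

classifiers = {
    "kNN": KNeighborsClassifier(n_neighbors=5),
    "RandomForest": RandomForestClassifier(n_estimators=100, random_state=0),
    "SVM": SVC(kernel="rbf"),
}

# Cross-validated accuracy per classifier; in the study, human ratings
# would serve as the reference labels instead of the toy labels above.
scores = {name: cross_val_score(clf, X, y, cv=5).mean()
          for name, clf in classifiers.items()}
```

With real data, `y` would be derived from the human classification ratings, so the cross-validated scores measure how closely each technique matches human judgments.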
Place, publisher, year, edition, pages
Frontiers Media S.A., 2021. Vol. 8, no 308, p. 1-17, article id 699505
Keywords [en]
gesture recognition, social gestures, machine learning, Biological motion, kinematics, social signal processing
National Category
Human Computer Interaction; Robotics
Research subject
Interaction Lab (ILAB)
Identifiers
URN: urn:nbn:se:his:diva-20560
DOI: 10.3389/frobt.2021.699505
ISI: 000716638700001
PubMedID: 34746242
Scopus ID: 2-s2.0-85118674941
OAI: oai:DiVA.org:his-20560
DiVA, id: diva2:1593626
Note
CC BY 4.0
Correspondence: Dr. Paul Hemeren, University of Skövde, Skövde, Sweden, paul.hemeren@his.se
This article is part of the Research Topic "Affective Shared Perception".
Published: 15 October 2021
Available from: 2021-09-13 Created: 2021-09-13 Last updated: 2022-09-02