Development of a Gesture Recognition Algorithm for Therapeutic Holding Robot

  • 陳瑋泰

Student thesis: Doctoral Thesis

Abstract

According to the World Health Organization's report, the percentage of older adults has increased significantly in recent years. Maintaining their physical and mental health is essential for future policy planning. Recently, many studies have found that companion robots have beneficial effects on the physical and mental health of older adults. This is explained by the fact that robots can offer various responses according to stimuli from humans; thus, recognition of these stimuli is the first step towards human–robot interaction. One example is tactile interaction, which is the preferred channel for communicating intimate emotions. Therefore, this study focused on recognizing human hand gestures of social touch using machine learning and deep learning, which are powerful classification tools that have been widely developed to recognize types of social touch gestures. In this study, five algorithms were used and their performance on hand gesture recognition was analyzed and compared: support vector machines (SVM), random forest (RF), and three convolutional neural networks, namely one-dimensional (1D-CNN), two-dimensional (2D-CNN), and three-dimensional (3D-CNN). These models recognized six types of social touch gestures: pat, stroke, grab, poke, scratch, and no touch. The dataset included 17,716 samples of these six gestures. All gestures were performed in two postures, stationary and holding, on a pressure-mapping sensor mat attached to a cylinder-shaped companion robot simulator. Ten-fold cross-validation was used to evaluate the performance of all models. The final accuracies for SVM, RF, 1D-CNN, 2D-CNN, and 3D-CNN were 16.76%, 49.37%, 70.51%, 70.46%, and 75.78%, respectively. The results indicate that the models could classify hand gestures based on pressure data. Future work is required to increase accuracy, either by enlarging the dataset or by using higher-resolution pressure sensors. Furthermore, the relationships between hand gestures and emotional states should also be considered.
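The evaluation protocol described above (ten-fold cross-validation of gesture classifiers on pressure-map features) can be sketched as follows. This is a minimal illustration, not the thesis's actual pipeline: the nearest-centroid classifier stands in for the SVM/RF/CNN models, and the feature dimensions and helper names are assumptions.

```python
# Hypothetical sketch of ten-fold cross-validation for social-touch gesture
# classification. A nearest-centroid classifier is used as a placeholder for
# the SVM/RF/CNN models; data shapes and helper names are illustrative only.
import random
from statistics import mean

# The six gesture classes named in the abstract.
GESTURES = ["pat", "stroke", "grab", "poke", "scratch", "no touch"]

def ten_fold_indices(n_samples, n_folds=10, seed=0):
    """Shuffle sample indices and split them into n_folds roughly equal folds."""
    idx = list(range(n_samples))
    random.Random(seed).shuffle(idx)
    return [idx[i::n_folds] for i in range(n_folds)]

def centroid(vectors):
    """Element-wise mean of equal-length feature vectors (flattened pressure frames)."""
    return [mean(col) for col in zip(*vectors)]

def cross_validated_accuracy(X, y, folds):
    """Average held-out accuracy: each fold serves once as the test set."""
    accuracies = []
    for test_idx in folds:
        test_set = set(test_idx)
        train_idx = [i for i in range(len(X)) if i not in test_set]
        # "Fit": one centroid per gesture class, from the training folds only.
        centroids = {
            label: centroid([X[i] for i in train_idx if y[i] == label])
            for label in set(y[i] for i in train_idx)
        }
        # "Predict": the class whose centroid is closest (squared Euclidean).
        correct = 0
        for i in test_idx:
            pred = min(
                centroids,
                key=lambda c: sum((a - b) ** 2 for a, b in zip(X[i], centroids[c])),
            )
            correct += pred == y[i]
        accuracies.append(correct / len(test_idx))
    return mean(accuracies)

# Usage with synthetic stand-in data: 30 samples per gesture, 8 features each.
rng = random.Random(1)
X = [[rng.gauss(k, 0.5) for _ in range(8)] for k in range(6) for _ in range(30)]
y = [GESTURES[k] for k in range(6) for _ in range(30)]
folds = ten_fold_indices(len(X))
acc = cross_validated_accuracy(X, y, folds)
```

The key property of the protocol is that every sample is tested exactly once while the classifier is always fit on the remaining nine folds, so the reported accuracy estimates generalization rather than training fit.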
Date of Award: 2020
Original language: English
Supervisor: Fong-chin Su (Supervisor)
