Reconstructing hand gestures with synergies extracted from dance movements

Citation of Original Publication

Olikkal, Parthan, Chris Dollo, Akshara Ajendla, Ann Sofie Clemmensen, and Ramana Vinjamuri. “Reconstructing Hand Gestures with Synergies Extracted from Dance Movements.” Scientific Reports 15, no. 1 (2025): 41670. https://doi.org/10.1038/s41598-025-25563-7.

Rights

Attribution-NonCommercial-NoDerivatives 4.0 International

Abstract

Comprehending and replicating human hand gestures is crucial for advancements in robotics, sign language interpretation, and human-computer interaction. While extensive research has focused on improving hand gesture recognition, the therapeutic benefits of dance movements have often been overlooked. This study introduces a novel approach to understanding hand gestures through structured movement primitives derived from hand gestures (mudras) used in the Indian classical dance form Bharatanatyam. Hand gesture synergies were extracted using Gaussian-modeled joint angular velocities and represented as fundamental syllables of motion. These syllables were then employed to reconstruct 75 diverse hand gestures, including American Sign Language (ASL) postures, a dataset of natural hand grasps, and traditional mudras. Comparative analysis showed that mudra-derived synergies achieved superior reconstruction accuracy (95.78% for natural grasps and 92.99% for mudras) compared to synergies derived from natural grasps (88.92% for natural grasps and 82.51% for mudras). These results suggest that the structured and intentional nature of Bharatanatyam mudras yields a stronger representation of movement syllables, with superior generalizability and precision. Additionally, the reconstructed gestures were successfully mapped onto Mitra, a humanoid robot with a five-degree-of-freedom hand, using a continuous joint-mapping approach. This research highlights the potential of dance-inspired structured learning in enhancing dexterity, rehabilitation, and motor control, paving the way for more efficient gesture-based interaction models in robotics, prosthetics, and rehabilitation.
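The synergy-based reconstruction idea summarized above can be illustrated with a minimal sketch. This is not the authors' exact pipeline (the paper uses Gaussian-modeled joint angular velocities; here a generic PCA-style decomposition stands in for synergy extraction, and the gesture data, dimensions, and synergy count are invented for illustration):

```python
import numpy as np

# Toy sketch of synergy extraction and gesture reconstruction.
# Assumptions (not from the paper): synthetic joint-angular-velocity
# profiles, PCA-based synergies, 4 synergies, variance-accounted-for
# (VAF) as a reconstruction-accuracy proxy.
rng = np.random.default_rng(0)

n_gestures, n_joints, n_samples = 75, 10, 100
# Each row is one gesture's flattened joint-velocity profile; built
# low-rank plus noise so a few synergies can explain most variance.
low_rank = rng.normal(size=(n_gestures, 4)) @ rng.normal(
    size=(4, n_joints * n_samples)
)
gestures = low_rank + 0.01 * rng.normal(size=low_rank.shape)

# Principal directions of the centered data act as synergies.
mean = gestures.mean(axis=0)
centered = gestures - mean
_, _, vt = np.linalg.svd(centered, full_matrices=False)
synergies = vt[:4]  # keep the 4 leading synergies

# Reconstruct each gesture as mean + weighted sum of synergies.
weights = centered @ synergies.T
reconstructed = mean + weights @ synergies

# Variance accounted for by the 4-synergy reconstruction.
vaf = 1 - np.sum((gestures - reconstructed) ** 2) / np.sum(centered**2)
print(f"VAF: {vaf:.4f}")
```

With rank-4 data plus small noise, four synergies reconstruct the gesture set almost perfectly; the paper's comparison of mudra-derived versus grasp-derived synergies amounts to asking which synergy basis yields higher reconstruction accuracy on held-out gestures.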