Biomimetic learning of hand gestures in a humanoid robot

dc.contributor.authorOlikkal, Parthan Sathishkumar
dc.contributor.authorPei, Dingyi
dc.contributor.authorKarri, Bharat Kashyap
dc.contributor.authorSatyanarayana, Ashwin
dc.contributor.authorKakoty, Nayan M.
dc.contributor.authorVinjamuri, Ramana
dc.date.accessioned2024-08-07T14:07:58Z
dc.date.available2024-08-07T14:07:58Z
dc.date.issued2024-07-18
dc.description.abstractHand gestures are a natural and intuitive form of communication, and integrating this communication method into robotic systems presents significant potential to improve human-robot collaboration. Recent advances in motor neuroscience have focused on replicating human hand movements from synergies, also known as movement primitives. Synergies, the fundamental building blocks of movement, serve as a potential strategy adopted by the central nervous system to generate and control movements. Identifying how synergies contribute to movement can help in the dexterous control of robotics, exoskeletons, and prosthetics, and extend their applications to rehabilitation. In this paper, 33 static hand gestures were recorded through a single RGB camera and identified in real time through the MediaPipe framework as participants made various postures with their dominant hand. Assuming an open palm as the initial posture, uniform joint angular velocities were obtained from all these gestures. By applying a dimensionality reduction method, kinematic synergies were obtained from these joint angular velocities. Kinematic synergies that explain 98% of the variance of movements were utilized to reconstruct new hand gestures using convex optimization. Reconstructed hand gestures and selected kinematic synergies were translated onto a humanoid robot, Mitra, in real time, as the participants demonstrated various hand gestures. The results showed that by using only a few kinematic synergies it is possible to generate various hand gestures with 95.7% accuracy. Furthermore, utilizing low-dimensional synergies to control high-dimensional end effectors holds promise for enabling near-natural human-robot collaboration.
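The pipeline in the abstract (joint angular velocities → dimensionality reduction → a small set of kinematic synergies explaining 98% of variance → gesture reconstruction) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the random data stands in for MediaPipe-derived joint angular velocities, PCA via SVD stands in for the unspecified dimensionality reduction method, and ordinary least squares stands in for the paper's convex optimization.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the paper's data: 33 gestures, each a vector of
# 20 joint angular velocities, generated from 4 latent synergies plus noise
# so that a low-dimensional structure exists to recover.
latent = rng.normal(size=(33, 4))
basis = rng.normal(size=(4, 20))
velocities = latent @ basis + 0.01 * rng.normal(size=(33, 20))

# PCA via SVD on mean-centered data.
mean = velocities.mean(axis=0)
centered = velocities - mean
U, s, Vt = np.linalg.svd(centered, full_matrices=False)
var_ratio = (s ** 2) / np.sum(s ** 2)

# Keep the smallest number of components whose cumulative variance >= 98%.
k = int(np.searchsorted(np.cumsum(var_ratio), 0.98)) + 1
synergies = Vt[:k]  # (k, 20): kinematic synergies

# Reconstruct one gesture as a weighted combination of synergies
# (least-squares stand-in for the paper's convex optimization step).
target = velocities[0]
weights, *_ = np.linalg.lstsq(synergies.T, target - mean, rcond=None)
reconstruction = mean + weights @ synergies
error = np.linalg.norm(reconstruction - target) / np.linalg.norm(target)
print(k, round(float(error), 4))
```

The same low-dimensional weight vector could then drive a high-dimensional end effector, which is the control idea the abstract highlights.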
dc.description.sponsorshipThe author(s) declare financial support was received for the research, authorship, and/or publication of this article. This research was funded by the National Science Foundation (NSF) CAREER Award, grant number HCC-2053498.
dc.description.urihttps://www.frontiersin.org/journals/human-neuroscience/articles/10.3389/fnhum.2024.1391531/full
dc.format.extent14 pages
dc.genrejournal articles
dc.identifierdoi:10.13016/m2y849-txkv
dc.identifier.citationOlikkal, Parthan, Dingyi Pei, Bharat Kashyap Karri, Ashwin Satyanarayana, Nayan M. Kakoty, and Ramana Vinjamuri. “Biomimetic Learning of Hand Gestures in a Humanoid Robot.” Frontiers in Human Neuroscience 18 (July 19, 2024). https://doi.org/10.3389/fnhum.2024.1391531.
dc.identifier.urihttps://doi.org/10.3389/fnhum.2024.1391531
dc.identifier.urihttp://hdl.handle.net/11603/35257
dc.language.isoen
dc.publisherFrontiers
dc.relation.isAvailableAtThe University of Maryland, Baltimore County (UMBC)
dc.relation.ispartofUMBC Student Collection
dc.relation.ispartofUMBC Computer Science and Electrical Engineering Department
dc.relation.ispartofUMBC Faculty Collection
dc.rightsCC BY 4.0 DEED Attribution 4.0 International
dc.rights.urihttps://creativecommons.org/licenses/by/4.0/
dc.subjectbiomimetic robots
dc.subjecthand gestures
dc.subjectsign language recognition
dc.subjectkinematic synergies
dc.subjecthand kinematics
dc.subjectbioinspired robots
dc.subjectMediaPipe
dc.subjecthuman-robot interaction
dc.titleBiomimetic learning of hand gestures in a humanoid robot
dc.typeText
dcterms.creatorhttps://orcid.org/0000-0002-5513-1150
dcterms.creatorhttps://orcid.org/0000-0001-7756-3678
dcterms.creatorhttps://orcid.org/0000-0003-1650-5524

Files

Original bundle

Name: fnhum181391531.pdf
Size: 2.67 MB
Format: Adobe Portable Document Format