Emerging Frontiers in Human–Robot Interaction

dc.contributor.author: Safavi, Farshad
dc.contributor.author: Olikkal, Parthan
dc.contributor.author: Pei, Dingyi
dc.contributor.author: Kamal, Sadia
dc.contributor.author: Meyerson, Helen
dc.contributor.author: Penumalee, Varsha
dc.contributor.author: Vinjamuri, Ramana
dc.date.accessioned: 2024-04-10T19:05:45Z
dc.date.available: 2024-04-10T19:05:45Z
dc.date.issued: 2024-03-18
dc.description.abstract: Effective interactions between humans and robots are vital to achieving shared tasks in collaborative processes. Robots can utilize diverse communication channels to interact with humans, such as hearing, speech, sight, touch, and learning. Our focus, amidst the various means of interaction between humans and robots, is on three emerging frontiers that significantly impact the future directions of human–robot interaction (HRI): (i) human–robot collaboration inspired by human–human collaboration, (ii) brain-computer interfaces, and (iii) emotional intelligent perception. First, we explore advanced techniques for human–robot collaboration, covering a range of methods from compliance- and performance-based approaches to synergistic and learning-based strategies, including learning from demonstration, active learning, and learning from complex tasks. Then, we examine innovative uses of brain-computer interfaces for enhancing HRI, with a focus on applications in rehabilitation, communication, and brain-state and emotion recognition. Finally, we investigate emotional intelligence in robotics, focusing on translating human emotions to robots via facial expressions, body gestures, and eye-tracking for fluid, natural interactions. Recent developments in these emerging frontiers and their impact on HRI are detailed and discussed. We highlight contemporary trends and emerging advancements in the field. Ultimately, this paper underscores the necessity of a multimodal approach in developing systems capable of adaptive behavior and effective interaction between humans and robots, thus offering a thorough understanding of the diverse modalities essential for maximizing the potential of HRI.
dc.description.sponsorship: Research supported by the National Science Foundation (CAREER Award HCC-2053498).
dc.description.uri: https://link.springer.com/article/10.1007/s10846-024-02074-7
dc.format.extent: 26 pages
dc.genre: journal articles
dc.identifier: doi:10.13016/m2ikzp-fn8f
dc.identifier.citation: Safavi, Farshad, Parthan Olikkal, Dingyi Pei, Sadia Kamal, Helen Meyerson, Varsha Penumalee, and Ramana Vinjamuri. “Emerging Frontiers in Human–Robot Interaction.” Journal of Intelligent & Robotic Systems 110, no. 2 (March 18, 2024): 45. https://doi.org/10.1007/s10846-024-02074-7.
dc.identifier.uri: https://doi.org/10.1007/s10846-024-02074-7
dc.identifier.uri: http://hdl.handle.net/11603/32988
dc.language.iso: en_US
dc.publisher: Springer
dc.relation.isAvailableAt: The University of Maryland, Baltimore County (UMBC)
dc.relation.ispartof: UMBC Faculty Collection
dc.relation.ispartof: UMBC Student Collection
dc.relation.ispartof: UMBC Computer Science and Electrical Engineering Department
dc.rights: CC BY 4.0 DEED Attribution 4.0 International
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/
dc.subject: Brain-Computer Interface
dc.subject: Computer Vision
dc.subject: Emotional Intelligent Perception
dc.subject: Human–Robot Collaboration
dc.subject: Human–Robot Interaction
dc.title: Emerging Frontiers in Human–Robot Interaction
dc.type: Text
dcterms.creator: https://orcid.org/0000-0002-5513-1150
dcterms.creator: https://orcid.org/0000-0001-7756-3678
dcterms.creator: https://orcid.org/0000-0003-1650-5524

Files

Original bundle

Name: s10846-024-02074-7.pdf
Size: 796.43 KB
Format: Adobe Portable Document Format