TagTeam: Towards Wearable-Assisted, Implicit Guidance for Human–Drone Teams

Date

2022-08-10

Citation of Original Publication

Jayarajah, Kasthuri, Aryya Gangopadhyay, and Nicholas Waytowich. “TagTeam: Towards Wearable-Assisted, Implicit Guidance for Human-Drone Teams.” In Proceedings of the 1st ACM Workshop on Smart Wearable Systems and Applications, 13–18. SmartWear ’22. New York, NY, USA: Association for Computing Machinery, 2022. https://doi.org/10.1145/3556560.3560715.

Rights

This work was written as part of one of the author's official duties as an Employee of the United States Government and is therefore a work of the United States Government. In accordance with 17 U.S.C. 105, no copyright protection is available for such works under U.S. Law.
Public Domain Mark 1.0

Abstract

The availability of sensor-rich smart wearables and tiny yet capable unmanned vehicles, such as nano quadcopters, opens up opportunities for a novel class of highly interactive, attention-shared human–machine teams. Reliable, lightweight, yet passive exchange of intent, data, and inferences within such human–machine teams makes them suitable for scenarios such as search-and-rescue, with significantly improved performance in terms of speed, accuracy, and semantic awareness. In this paper, we articulate a vision for such human–drone teams and the key technical capabilities they must encompass. We present TagTeam, an early prototype of such a team, and share a promising demonstration of a key capability (i.e., motion awareness).