RadSense: Enabling one hand and no hands interaction for sterile manipulation of medical images using Doppler radar

dc.contributor.author: Miller, Elishiah
dc.contributor.author: Li, Zheng
dc.contributor.author: Mentis, Helena
dc.contributor.author: Park, Adrian
dc.contributor.author: Zhu, Ting
dc.contributor.author: Banerjee, Nilanjan
dc.date.accessioned: 2020-11-25T18:33:49Z
dc.date.available: 2020-11-25T18:33:49Z
dc.date.issued: 2019-11-15
dc.description.abstract: In this paper, we show how surgeons can interact with medical images using finger and hand gestures in two situations: when one hand is free and when no hands are free. We explain how interaction with only one hand, or a couple of fingers, is beneficial: it lets surgeons interact continuously without releasing their tools or leaving the operating table, saving valuable patient time. To this end, we present RadSense, an end-to-end, unobtrusive system that uses Doppler radar sensing to recognize hand and finger gestures when either one or both hands are busy. Our system provides the following important capabilities: (1) touchless input for sterile interaction with connected health applications; (2) hand and finger gesture recognition when either one or both hands are busy holding tools, extending multitasking capabilities for health professionals; and (3) mobile and networked operation, allowing for custom wearable and non-wearable configurations. We evaluated our system in a simulated operating room, manipulating preoperative images with four gestures: circle, double tap, swipe, and finger click. We collected data from five subjects and trained a k-nearest-neighbor multi-class classifier using 15-fold cross-validation, achieving 94.5% precision for gesture classification. We conclude that our system performs with high accuracy and is useful in cases where only one hand or a few fingers are free to interact because the hands are busy.
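
As a rough, illustrative sketch of the classification step described in the abstract (not the authors' code), the following Python snippet trains a k-nearest-neighbor multi-class classifier and evaluates it with 15-fold cross-validation using scikit-learn. The load_doppler_features helper, the feature dimensionality, and the choice of k are hypothetical stand-ins; the paper's actual Doppler-radar feature pipeline is not reproduced here.

    # Minimal sketch: KNN gesture classification with 15-fold cross-validation.
    # All data shapes and parameters here are illustrative assumptions.
    import numpy as np
    from sklearn.model_selection import StratifiedKFold, cross_val_score
    from sklearn.neighbors import KNeighborsClassifier

    def load_doppler_features():
        # Hypothetical stand-in for feature extraction from Doppler radar traces.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 32))    # 200 gesture samples, 32 features each
        y = rng.integers(0, 4, size=200)  # 4 classes: circle, double tap, swipe, finger click
        return X, y

    X, y = load_doppler_features()
    clf = KNeighborsClassifier(n_neighbors=5)  # k = 5 is an assumption; the paper's k is not stated here
    cv = StratifiedKFold(n_splits=15, shuffle=True, random_state=0)
    scores = cross_val_score(clf, X, y, cv=cv, scoring="precision_macro")
    print(f"Mean cross-validated macro precision: {scores.mean():.3f}")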
dc.description.sponsorship: This work is supported in part by National Science Foundation (NSF), USA, grants CNS-1539047 and CNS-1652669. This work is also supported in part by the James and Sylvia Earl Simulation to Advance Innovation and Learning Center (SAIL). We acknowledge Rashmi Prava Patro for helping with data collection.
dc.description.uri: https://www.sciencedirect.com/science/article/pii/S2352648319300534?via%3Dihub
dc.format.extent: 12 pages
dc.genre: journal articles
dc.identifier: doi:10.13016/m2qkg7-jxvv
dc.identifier.citation: Elishiah Miller, Zheng Li, Helena Mentis, Adrian Park, Ting Zhu, Nilanjan Banerjee, "RadSense: Enabling one hand and no hands interaction for sterile manipulation of medical images using Doppler radar," Smart Health, Volume 15, 2020, DOI: https://doi.org/10.1016/j.smhl.2019.100089
dc.identifier.uri: https://doi.org/10.1016/j.smhl.2019.100089
dc.identifier.uri: http://hdl.handle.net/11603/20147
dc.language.iso: en_US
dc.publisher: Elsevier
dc.relation.isAvailableAt: The University of Maryland, Baltimore County (UMBC)
dc.relation.ispartof: UMBC Computer Science and Electrical Engineering Department Collection
dc.relation.ispartof: UMBC Faculty Collection
dc.relation.ispartof: UMBC Student Collection
dc.relation.ispartof: UMBC Information Systems Department
dc.rights: This item is likely protected under Title 17 of the U.S. Copyright Law. Unless the item is under a Creative Commons license, for uses protected by Copyright Law, contact the copyright holder or the author.
dc.subject: healthcare
dc.subject: human centered computing
dc.subject: wearable devices
dc.subject: gesture recognition
dc.subject: busy hand interaction
dc.title: RadSense: Enabling one hand and no hands interaction for sterile manipulation of medical images using Doppler radar
dc.type: Text

Files

License bundle

Name: license.txt
Size: 2.56 KB
Format: Item-specific license agreed upon at submission