LASense: Pushing the Limits of Fine-grained Activity Sensing Using Acoustic Signals
Citation of Original Publication
Li, Dong, Jialin Liu, Sunghoon Ivan Lee, and Jie Xiong. “LASense: Pushing the Limits of Fine-Grained Activity Sensing Using Acoustic Signals.” Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 6, no. 1 (2022): 21:1-21:27. https://doi.org/10.1145/3517253.
Rights
This item is likely protected under Title 17 of the U.S. Copyright Law. Unless on a Creative Commons license, for uses protected by Copyright Law, contact the copyright holder or the author.
Abstract
Acoustic signals have been widely adopted for sensing fine-grained human activities, including respiration monitoring, finger tracking, and eye blink detection. One major challenge for acoustic sensing is its extremely limited sensing range, which becomes even more severe when sensing fine-grained activities. Unlike prior efforts that rely on multiple microphones and/or advanced deep learning techniques to achieve a long sensing range, we propose LASense, a system that can significantly increase the sensing range for fine-grained human activities using a single pair of speaker and microphone. To achieve this, LASense introduces a virtual transceiver idea that purely leverages delicate signal processing techniques in software. To demonstrate the effectiveness of LASense, we apply the proposed approach to three fine-grained human activities: respiration, finger tapping, and eye blinking. For respiration monitoring, LASense significantly extends the sensing range from the state-of-the-art 2 m to 6 m. For the finer-grained finger tapping and eye blink detection, it increases the state-of-the-art sensing range by 150% and 80%, respectively.
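The abstract does not detail the signal processing involved, but the general principle behind acoustic sensing of fine-grained motion — tracking the phase of an inaudible tone reflected off the body — can be sketched as follows. This is a toy simulation, not LASense's actual virtual-transceiver algorithm; all parameters (carrier frequency, distances, displacement amplitude) are illustrative assumptions:

```python
import numpy as np

# Illustrative parameters (not from the paper)
fs = 48_000          # audio sample rate (Hz)
fc = 18_000          # inaudible carrier frequency (Hz)
T  = 20.0            # capture duration (s)
c  = 343.0           # speed of sound (m/s)

t = np.arange(int(fs * T)) / fs
# Simulated chest wall: 5 mm displacement at 0.25 Hz (15 breaths/min)
breath = 0.005 * np.sin(2 * np.pi * 0.25 * t)
d = 1.0 + breath                               # speaker-to-chest distance (m)
rx = np.cos(2 * np.pi * fc * (t - 2 * d / c))  # echo with round-trip delay

# I/Q demodulation against the transmitted carrier
I = rx * np.cos(2 * np.pi * fc * t)
Q = -rx * np.sin(2 * np.pi * fc * t)

def lowpass(x, win):
    """Moving-average low-pass filter (suppresses the 2*fc mixing product)."""
    s = np.cumsum(np.insert(x, 0, 0.0))
    return (s[win:] - s[:-win]) / win

win = int(fs * 0.01)                  # 10 ms averaging window
phase = np.unwrap(np.angle(lowpass(I, win) + 1j * lowpass(Q, win)))
disp = -phase * c / (4 * np.pi * fc)  # carrier phase -> target displacement

# Respiration rate = dominant spectral peak in the 0.1-0.7 Hz band
disp = disp - disp.mean()
spec = np.abs(np.fft.rfft(disp * np.hanning(len(disp))))
freqs = np.fft.rfftfreq(len(disp), 1 / fs)
band = (freqs > 0.1) & (freqs < 0.7)
rate_hz = freqs[band][np.argmax(spec[band])]
print(f"estimated respiration rate: {rate_hz * 60:.1f} breaths/min")
```

The millimeter-scale chest motion shifts the echo's phase by 4*pi*fc*d/c, so even sub-wavelength displacements are visible in the demodulated baseband signal; the script recovers the simulated 15 breaths/min rate from that phase. The range challenge the abstract describes arises because the reflected signal weakens rapidly with distance, burying this phase variation in noise.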
