LASO: Exploiting Locomotive and Acoustic Signatures over the Edge to Annotate IMU Data for Human Activity Recognition

dc.contributor.author: Chatterjee, Soumyajit
dc.contributor.author: Chakma, Avijoy
dc.contributor.author: Gangopadhyay, Aryya
dc.contributor.author: Roy, Nirmalya
dc.contributor.author: Mitra, Bivas
dc.contributor.author: Chakraborty, Sandip
dc.date.accessioned: 2020-11-20T16:29:37Z
dc.date.available: 2020-11-20T16:29:37Z
dc.description: ICMI '20: Proceedings of the 2020 International Conference on Multimodal Interaction, October 2020
dc.description.abstract [en_US]: Annotated IMU sensor data from smart devices and wearables are essential for developing supervised models for fine-grained human activity recognition, although generating sufficient annotated data for diverse human activities in different environments is challenging. Existing approaches primarily use human-in-the-loop techniques such as active learning; however, these are tedious, costly, and time-consuming. Leveraging the acoustic data available from microphones embedded in the data-collection devices, in this paper we propose LASO, a multimodal approach that automatically annotates IMU data from acoustic and locomotive information. LASO runs on the edge device itself, ensuring that only the annotated IMU data is collected while the acoustic data is discarded on the device, thereby preserving the user's audio privacy. In the absence of any pre-existing labeling information, such auto-annotation is challenging, as the IMU data must be sessionized for activities of different time scales in a completely unsupervised manner. We use a change-point detection technique while synchronizing the locomotive information from the IMU data with the acoustic data, and then use pre-trained audio-based activity recognition models to label the IMU data while handling acoustic noise. LASO efficiently annotates IMU data, without any explicit human intervention, with a mean accuracy of $0.93$ ($\pm 0.04$) and $0.78$ ($\pm 0.05$) on two real-life datasets from workshop and kitchen environments, respectively.
dc.description.sponsorship [en_US]: This work has been partially supported by the MHRD-funded SPARC collaborative project ‘SFE_SKI-1220’, along with partial support from the SERB Early Career Research Award ECR/2017/000121 (18/07/2017), funded by the Department of Science and Technology, Government of India. N. Roy, A. Chakma, and A. Gangopadhyay acknowledge NSF CAREER Award #1750936, ONR grant N00014-18-1-2462, and Alzheimer’s Association Grant/Award #AARG-17-533039.
dc.description.uri [en_US]: https://dl.acm.org/doi/abs/10.1145/3382507.3418826
dc.format.extent [en_US]: 10 pages
dc.genre [en_US]: conference papers and proceedings
dc.identifier: doi:10.13016/m2bgju-5t3n
dc.identifier.citation [en_US]: Soumyajit Chatterjee, Avijoy Chakma, Aryya Gangopadhyay, Nirmalya Roy, Bivas Mitra, and Sandip Chakraborty, "LASO: Exploiting Locomotive and Acoustic Signatures over the Edge to Annotate IMU Data for Human Activity Recognition," ICMI '20: Proceedings of the 2020 International Conference on Multimodal Interaction, pages 333–342, https://doi.org/10.1145/3382507.3418826
dc.identifier.uri: https://doi.org/10.1145/3382507.3418826
dc.identifier.uri: http://hdl.handle.net/11603/20113
dc.language.iso [en_US]: en_US
dc.publisher [en_US]: ACM
dc.relation.isAvailableAt: The University of Maryland, Baltimore County (UMBC)
dc.relation.ispartof: UMBC Information Systems Department Collection
dc.relation.ispartof: UMBC Faculty Collection
dc.relation.ispartof: UMBC Student Collection
dc.rights: This item is likely protected under Title 17 of the U.S. Copyright Law. Unless on a Creative Commons license, for uses protected by Copyright Law, contact the copyright holder or the author.
dc.subject: locomotive
dc.subject: smart devices
dc.subject: human activity recognition
dc.subject: LASO
dc.title [en_US]: LASO: Exploiting Locomotive and Acoustic Signatures over the Edge to Annotate IMU Data for Human Activity Recognition
dc.type [en_US]: Text
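
The abstract describes a two-step pipeline: unsupervised sessionization of the IMU stream via change-point detection, followed by labeling each session with a pre-trained audio-based activity model applied to the time-aligned acoustic segment. The sketch below is a minimal illustration of that idea, not the authors' implementation: the PELT change-point detector from the ruptures library, the penalty value, the sampling rates, and the audio_classifier callable are all assumptions chosen for illustration.

import numpy as np
import ruptures as rpt  # change-point detection library (assumed dependency)

def sessionize_imu(accel, penalty=10.0):
    """Split an (N, 3) accelerometer stream into activity sessions."""
    magnitude = np.linalg.norm(accel, axis=1)  # collapse 3 axes into 1 signal
    # PELT with an RBF kernel returns indices where the signal's statistics
    # change; the last breakpoint returned by ruptures is always len(signal).
    breakpoints = rpt.Pelt(model="rbf").fit(magnitude).predict(pen=penalty)
    starts = [0] + breakpoints[:-1]
    return list(zip(starts, breakpoints))  # (start, end) sample indices

def annotate(accel, audio, imu_rate, audio_rate, audio_classifier):
    """Label each IMU session from the time-aligned audio segment.

    audio_classifier is a hypothetical stand-in for a pre-trained audio
    activity model: it maps a 1-D audio clip to an activity label.
    """
    labels = []
    for start, end in sessionize_imu(accel):
        # Map IMU sample indices onto the audio timeline.
        a0 = int(start / imu_rate * audio_rate)
        a1 = int(end / imu_rate * audio_rate)
        labels.append(((start, end), audio_classifier(audio[a0:a1])))
    return labels

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # 8 s of synthetic IMU data at 100 Hz: a calm half, then a vigorous half.
    accel = np.vstack([rng.normal(0, 0.1, (400, 3)),
                       rng.normal(0, 1.0, (400, 3))])
    audio = rng.normal(0, 1.0, 8 * 16000)  # 8 s of dummy audio at 16 kHz
    dummy = lambda clip: "active" if clip.std() > 0.5 else "idle"  # toy stand-in
    print(annotate(accel, audio, imu_rate=100, audio_rate=16000,
                   audio_classifier=dummy))

On this toy input the detector should report a boundary near sample 400, where the signal variance jumps; in a real deployment the penalty (or an equivalent hyperparameter) would govern how aggressively the stream is split into sessions.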

Files

License bundle
Name: license.txt
Size: 2.56 KB
Format: Item-specific license agreed to upon submission