Unseen Activity Recognitions: A Hierarchical Active Transfer Learning Approach
Links to Files: https://ieeexplore.ieee.org/document/7979989/
Type of Work: Conference papers and proceedings preprints (11 pages)
Citation of Original Publication: M. A. U. Alam and N. Roy, "Unseen Activity Recognitions: A Hierarchical Active Transfer Learning Approach," 2017 IEEE 37th International Conference on Distributed Computing Systems (ICDCS), Atlanta, GA, 2017, pp. 436-446.
Rights: This item may be protected under Title 17 of the U.S. Copyright Law. It is made available by UMBC for non-commercial research and education. For permission to publish or reproduce, please contact the author.
Mobile Pervasive & Sensor Computing Lab
Human activity recognition (AR) is an essential element of user-centric and context-aware applications. While previous studies have shown promising results using various machine learning algorithms, most of them can only recognize activities that were previously seen in the training data. We investigate the challenge of improving the recognition of unseen daily activities in a smart home environment by better exploiting the hierarchical taxonomy of complex daily activities. We first (a) design a hierarchical representation of a complex activity taxonomy in terms of human-readable semantic attributes, and (b) develop a hierarchy of classifiers that incorporates a cluster tree built on domain knowledge from the training samples. Although this model is effective at recognizing complex activities previously seen in the training data, it is not well suited to recognizing unseen complex activities without new training samples. To tackle this challenge, we extend a Hierarchical Active Transfer Learning (HATL) approach that exploits the semantic attribute cluster structure of complex activities shared between the seen (source) and unseen (target) activity domains. Our approach employs transfer and active learning to help label unlabeled target-domain data by spawning the most effective queries. We evaluated our approach on two real-time smart home systems (IRB #HP-00064387), and the results corroborate substantial improvements in recognizing unseen complex activities.
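To make the abstract's key ideas concrete, the following is a minimal sketch (not the authors' implementation) of two of them: describing activities by human-readable semantic attributes shared between seen and unseen domains, and an uncertainty-driven active learning step that selects the target-domain sample whose attribute-based prediction is least certain as the next labeling query. All attribute names, activity labels, and the distance-based posterior below are illustrative assumptions.

```python
from math import log2

# Seen (source) activities described by hypothetical binary semantic attributes:
# [uses_kitchen, uses_water, sitting, object_in_hand]
SEEN = {
    "cooking":      (1, 1, 0, 1),
    "washing_dish": (1, 1, 0, 1),
    "reading":      (0, 0, 1, 1),
    "watching_tv":  (0, 0, 1, 0),
}

def hamming(a, b):
    """Distance between two binary attribute vectors."""
    return sum(x != y for x, y in zip(a, b))

def attribute_posterior(sample):
    """Soft assignment of a sample to seen activities, weighted by
    attribute-space proximity (an illustrative stand-in for the paper's
    attribute classifiers)."""
    weights = {k: 2.0 ** -hamming(sample, v) for k, v in SEEN.items()}
    total = sum(weights.values())
    return {k: w / total for k, w in weights.items()}

def entropy(dist):
    """Shannon entropy of a discrete distribution, in bits."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def select_query(unlabeled):
    """Active learning step: pick the unlabeled target-domain sample whose
    attribute-based posterior is most uncertain (highest entropy)."""
    return max(unlabeled, key=lambda s: entropy(attribute_posterior(s)))

if __name__ == "__main__":
    # Target-domain samples, including an unseen attribute combination.
    unlabeled = [
        (1, 1, 0, 1),  # matches two kitchen activities, but both equally
        (0, 0, 1, 1),  # close to "reading"
        (1, 0, 1, 0),  # unseen mix of attributes -> most informative query
    ]
    print(select_query(unlabeled))
```

In the paper's setting the query would then be answered by an annotator and the new label propagated along the semantic-attribute cluster tree; here the posterior is a simple distance heuristic standing in for the trained hierarchy of classifiers.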