TS-ACL: A Time Series Analytic Continual Learning Framework for Privacy-Preserving and Class-Incremental Pattern Recognition

dc.contributor.author: Fan, Kejia
dc.contributor.author: Li, Jiaxu
dc.contributor.author: Lai, Songning
dc.contributor.author: Lv, Linpu
dc.contributor.author: Liu, Anfeng
dc.contributor.author: Tang, Jianheng
dc.contributor.author: Song, Houbing
dc.contributor.author: Zhuang, Huiping
dc.date.accessioned: 2024-12-11T17:02:31Z
dc.date.available: 2024-12-11T17:02:31Z
dc.date.issued: 2024-10-21
dc.description.abstract: Class-incremental Learning (CIL) in Time Series Classification (TSC) aims to incrementally train models on streaming time series data that arrives continuously. The main problem in this scenario is catastrophic forgetting, i.e., training models with new samples inevitably leads to the forgetting of previously learned knowledge. Among existing methods, replay-based methods achieve satisfactory performance but compromise privacy, while exemplar-free methods protect privacy but suffer from low accuracy. More critically, owing to their reliance on gradient-based updates, these existing methods fundamentally cannot solve the catastrophic forgetting problem. In TSC scenarios with continuously arriving data and temporally shifting distributions, these methods become even less practical. In this paper, we propose a Time Series Analytic Continual Learning framework, called TS-ACL. Inspired by analytical learning, TS-ACL transforms neural network updates into gradient-free linear regression problems, thereby fundamentally mitigating catastrophic forgetting. Specifically, employing a pre-trained and frozen feature extraction encoder, TS-ACL only needs to update its analytic classifier recursively in a lightweight manner that is highly suitable for real-time applications and large-scale data processing. Additionally, we theoretically demonstrate that the model obtained recursively through TS-ACL is exactly equivalent to a model trained on the complete dataset in a centralized manner, thereby establishing the property of absolute knowledge memory. Extensive experiments validate the superior performance of TS-ACL.
dc.description.uri: http://arxiv.org/abs/2410.15954
dc.format.extent: 12 pages
dc.genre: journal articles
dc.genre: preprints
dc.identifier: doi:10.13016/m2swkd-cfl1
dc.identifier.uri: https://doi.org/10.48550/arXiv.2410.15954
dc.identifier.uri: http://hdl.handle.net/11603/37078
dc.language.iso: en_US
dc.relation.isAvailableAt: The University of Maryland, Baltimore County (UMBC)
dc.relation.ispartof: UMBC Faculty Collection
dc.relation.ispartof: UMBC Information Systems Department
dc.rights: This item is likely protected under Title 17 of the U.S. Copyright Law. Unless on a Creative Commons license, for uses protected by Copyright Law, contact the copyright holder or the author.
dc.subject: Computer Science - Machine Learning
dc.subject: Computer Science - Artificial Intelligence
dc.subject: UMBC Security and Optimization for Networked Globe Laboratory (SONG Lab)
dc.title: TS-ACL: A Time Series Analytic Continual Learning Framework for Privacy-Preserving and Class-Incremental Pattern Recognition
dc.type: Text
dcterms.creator: https://orcid.org/0000-0003-2631-9223
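
The abstract describes replacing gradient-based classifier updates with gradient-free recursive linear regression over frozen-encoder features, such that the recursively updated model matches one trained on the full dataset at once. A minimal sketch of how such a recursive ridge-regression (recursive least squares) update can work; the class name, shapes, and the Woodbury-identity formulation below are illustrative assumptions, not the paper's exact implementation:

```python
import numpy as np

class RecursiveAnalyticClassifier:
    """Gradient-free linear classifier updated one batch at a time.

    X holds features from a frozen encoder; Y holds one-hot labels.
    After any sequence of update() calls, W equals the ridge solution
    (X_all^T X_all + reg*I)^{-1} X_all^T Y_all over all data seen so far,
    without storing any past samples.
    """

    def __init__(self, feat_dim, num_classes, reg=1.0):
        # Running inverse of the regularized feature autocorrelation matrix.
        self.R = np.eye(feat_dim) / reg
        self.W = np.zeros((feat_dim, num_classes))

    def update(self, X, Y):
        # Woodbury identity: fold a new batch into R^{-1} by inverting
        # only a batch-sized matrix, never revisiting old data.
        K = self.R @ X.T @ np.linalg.inv(np.eye(len(X)) + X @ self.R @ X.T)
        self.R = self.R - K @ X @ self.R
        # Correct the weights using the residual on the new batch.
        self.W = self.W + self.R @ X.T @ (Y - X @ self.W)

    def predict(self, X):
        return np.argmax(X @ self.W, axis=1)
```

Because the update depends only on `R` and `W`, no exemplars of earlier classes are retained, which is where the privacy-preserving, exemplar-free property comes from in this style of method.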

Files

Original bundle
Name: 2410.15954v2.pdf
Size: 1.86 MB
Format: Adobe Portable Document Format