Long-Tailed Federated Learning in Internet of Medical Things Based on Ensemble Distillation and Imbalanced Calibration

dc.contributor.author: Jiang, Bin
dc.contributor.author: Shang, Yuchen
dc.contributor.author: Yue, Guanghui
dc.contributor.author: Wang, Huihui Helen
dc.contributor.author: Song, Houbing
dc.date.accessioned: 2025-03-11T14:42:49Z
dc.date.available: 2025-03-11T14:42:49Z
dc.date.issued: 2025-01-31
dc.description.abstract: The Internet of Medical Things (IoMT) has a promising future, as its devices can monitor vital signs, offer treatment guidance, and perform real-time diagnostics using AI and wireless communication technologies. However, because patient data are difficult to collect at scale and carry privacy risks, traditional centralized machine learning methods are often hard to apply on IoMT devices. Federated learning, as a privacy-preserving technology, aims to build high-quality deep learning models across distributed clients while protecting data privacy. However, current popular federated learning methods perform poorly on non-IID data, especially under long-tailed class distributions, leading to unsatisfactory results. Additionally, privacy constraints on distributed clients prevent these methods from applying traditional centralized deep learning techniques to long-tailed data, which in IoMT typically follows long-tailed, heterogeneous distributions. To address these challenges, this paper proposes Privacy-preserving Computing Client Scoring and Knowledge Distillation (FedLT+SKD). The method uses privacy-preserving computation to provide prior knowledge of the global class distribution while ensuring data privacy. Based on this prior knowledge, it employs a score-based sampling strategy to identify clients that perform well on long-tailed data and uploads their local models to the server. On the server side, the robustness of the global model is enhanced through ensemble distillation and imbalance calibration. We verify the effectiveness of the method on the medical datasets ISIC, ChestX-ray14, and MRI, as well as on the standard benchmarks CIFAR-10-LT and CIFAR-100-LT; the experimental results show that it outperforms popular federated and long-tailed learning methods. (A minimal illustrative sketch of the scoring, calibration, and distillation steps follows this metadata record.)
dc.description.sponsorship: This work was supported in part by the National Natural Science Foundation of China under Grant 62102264, the Taishan Scholar Project under Grant tsqnz20230602, the Natural Science Foundation of Shandong Province under Grants ZR2024MF115 and ZR2023LZH010, and the Youth Innovation University Team Project in Shandong under Grant 2022KJ062.
dc.description.uri: https://ieeexplore.ieee.org/abstract/document/10869336/
dc.format.extent: 12 pages
dc.genre: journal articles
dc.genre: postprints
dc.identifier: doi:10.13016/m2fu3y-jnaw
dc.identifier.citation: Jiang, Bin, Yuchen Shang, Guanghui Yue, Huihui Helen Wang, and Houbing Herbert Song. "Long-Tailed Federated Learning in Internet of Medical Things Based on Ensemble Distillation and Imbalanced Calibration". IEEE Transactions on Consumer Electronics, 2025, 1–1. https://doi.org/10.1109/TCE.2025.3537062.
dc.identifier.uri: http://doi.org/10.1109/TCE.2025.3537062
dc.identifier.uri: http://hdl.handle.net/11603/37777
dc.language.iso: en_US
dc.publisher: IEEE
dc.relation.isAvailableAt: The University of Maryland, Baltimore County (UMBC)
dc.relation.ispartof: UMBC Faculty Collection
dc.relation.ispartof: UMBC Information Systems Department
dc.rights: © 2025 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
dc.subject: Privacy computing
dc.subject: Tail
dc.subject: Training
dc.subject: Distributed databases
dc.subject: Servers
dc.subject: Analytical models
dc.subject: Federated learning
dc.subject: Client scoring
dc.subject: UMBC Security and Optimization for Networked Globe Laboratory (SONG Lab)
dc.subject: Computational modeling
dc.subject: Ensemble distillation
dc.subject: Internet of Medical Things
dc.subject: Data privacy
dc.subject: Long-tailed data
dc.subject: Heavily-tailed distribution
dc.subject: Data models
dc.title: Long-Tailed Federated Learning in Internet of Medical Things Based on Ensemble Distillation and Imbalanced Calibration
dc.type: Text
dcterms.creator: https://orcid.org/0000-0003-2631-9223
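
Illustrative sketch (editor-supplied, not part of the published record): the abstract describes three main ingredients, namely a privacy-preserving estimate of the global class distribution, score-based selection of clients that perform well on tail classes, and server-side ensemble distillation with imbalance calibration. The short Python sketch below shows one way those pieces could fit together in a single round. It is a minimal sketch under assumptions, not the authors' implementation; the function names (score_client, calibrate_logits, ensemble_distill_targets) and the specific scoring and calibration formulas are hypothetical.

    # Minimal, assumption-laden sketch of score-based client selection and
    # ensemble distillation targets with class-prior calibration. Not the
    # FedLT+SKD reference code; all names and formulas here are illustrative.
    import numpy as np

    def score_client(tail_correct: int, tail_total: int,
                     head_correct: int, head_total: int,
                     tail_weight: float = 0.7) -> float:
        """Score a client by weighting tail-class accuracy above head-class accuracy."""
        tail_acc = tail_correct / max(tail_total, 1)
        head_acc = head_correct / max(head_total, 1)
        return tail_weight * tail_acc + (1.0 - tail_weight) * head_acc

    def calibrate_logits(logits: np.ndarray, class_prior: np.ndarray,
                         tau: float = 1.0) -> np.ndarray:
        """Subtract tau * log(prior) from logits (a logit-adjustment-style calibration)."""
        return logits - tau * np.log(class_prior + 1e-12)

    def ensemble_distill_targets(client_logits: list, scores: list,
                                 class_prior: np.ndarray,
                                 temperature: float = 2.0) -> np.ndarray:
        """Average the selected clients' calibrated logits (weighted by score)
        and soften them into distillation targets for the global model."""
        weights = np.asarray(scores) / np.sum(scores)
        stacked = np.stack([calibrate_logits(l, class_prior) for l in client_logits])
        avg = np.tensordot(weights, stacked, axes=1)       # weighted ensemble of logits
        z = avg / temperature
        z -= z.max(axis=-1, keepdims=True)                 # numerically stable softmax
        p = np.exp(z)
        return p / p.sum(axis=-1, keepdims=True)           # soft targets for distillation

    # Toy usage: 3 selected clients, a public batch of 4 samples, 3 classes,
    # and a long-tailed global prior obtained (in the paper) via privacy-preserving computation.
    rng = np.random.default_rng(0)
    prior = np.array([0.7, 0.25, 0.05])
    logits = [rng.normal(size=(4, 3)) for _ in range(3)]
    scores = [score_client(8, 10, 90, 100),
              score_client(5, 10, 95, 100),
              score_client(9, 10, 80, 100)]
    targets = ensemble_distill_targets(logits, scores, prior)
    print(targets.round(3))

The calibration step above follows the common logit-adjustment idea of subtracting the log class prior, which is one plausible reading of "imbalanced calibration"; consult the paper itself for the exact formulation and training loop.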

Files

Original bundle

Name: LongTailed_Federated_Learning_in_Internet_of_Medical_Things_Based_on_Ensemble_Distillation_and_Imbalanced_Calibration.pdf
Size: 4.18 MB
Format: Adobe Portable Document Format