Federated Learning for Internet of Underwater Things Based on Lightweight Distillation and Data Refinement
Citation of Original Publication
Jiang, Bin, Jiacong Fei, Fei Luo, Yongxin Liu, and Houbing Herbert Song. “Federated Learning for Internet of Underwater Things Based on Lightweight Distillation and Data Refinement.” IEEE Internet of Things Journal, September 2, 2025, 1–1. https://doi.org/10.1109/JIOT.2025.3605230.
Rights
© 2025 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
Subjects
Data integrity
UMBC Security and Optimization for Networked Globe Laboratory (SONG Lab)
Training
Federated learning
dual-path collaborative optimization
Underwater acoustics
Image enhancement
Bandwidth
knowledge distillation
Internet of Things
Servers
data quality enhancement
Underwater federated learning
Data models
Collaboration
Abstract
Underwater federated learning (UFL) is an emerging technology for realizing distributed intelligent collaboration in the Internet of Underwater Things (IoUT), but its application faces two challenges: the limited bandwidth of underwater communication leads to low model transmission efficiency, and environmental interference leaves the data low in quality and highly heterogeneous. In this paper, an underwater federated learning framework with dual-path collaborative optimization is proposed to address these problems systematically through the joint design of knowledge distillation and data quality enhancement. Specifically, to improve transmission efficiency, a knowledge distillation mechanism is designed in which collaborative distillation between lightweight teacher and student models compresses the complex model into a simplified model suited to low-bandwidth transmission. To improve data quality, a supervised data quality enhancement (S-DQE) method is proposed that integrates traditional enhancement techniques with deep learning-based approaches and optimizes feature representation through the joint application of contrastive learning and adversarial training, thereby effectively mitigating the problem of low-quality underwater data. Finally, numerical results compare the complete scheme with the initial federated learning scheme, the lightweight model scheme, and the lightweight plus data quality enhancement scheme, clearly demonstrating its performance gains.
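To illustrate the teacher-student distillation described in the abstract, the sketch below shows a standard distillation objective in PyTorch: softened teacher logits supervise the student via KL divergence, mixed with the hard-label cross-entropy. The temperature T, the mixing weight alpha, and the function name are illustrative assumptions; the paper's exact loss is not specified in the abstract.

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Standard teacher-student distillation: softened KL term plus hard-label CE.

    T and alpha are hypothetical hyperparameters, not values from the paper.
    """
    # KL divergence between temperature-softened student and teacher distributions.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # rescale so gradients match the hard-label term's magnitude
    # Ordinary cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard
```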
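Similarly, the contrastive learning component of S-DQE could resemble a standard InfoNCE (NT-Xent) objective over paired views of the same underwater image, for example a raw capture and its enhanced version. The pairing scheme and the temperature value below are assumptions for illustration, not the paper's specification.

```python
import torch
import torch.nn.functional as F

def info_nce_loss(z_a, z_b, temperature=0.1):
    """InfoNCE over two batches of embeddings (batch, dim); row i of z_a and
    row i of z_b are assumed to be views of the same image (the positives).
    """
    z_a = F.normalize(z_a, dim=1)
    z_b = F.normalize(z_b, dim=1)
    # Pairwise cosine similarities; positives lie on the diagonal.
    logits = z_a @ z_b.t() / temperature
    targets = torch.arange(z_a.size(0), device=z_a.device)
    return F.cross_entropy(logits, targets)
```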
