A Federated Learning Approach to Minimize Communication Rounds Using Noise Rectification
| dc.contributor.author | Mishra R.; Gupta H.P. | |
| dc.date.accessioned | 2025-05-23T11:12:56Z | |
| dc.description.abstract | Federated learning is a distributed training framework that preserves data privacy and reduces communication overhead while training a shared model among multiple participants. Noise in the local datasets and communication channels degrades model performance and increases the number of communication rounds required for convergence. This paper proposes a federated learning approach that rectifies noise in the local datasets and communication channels to reduce communication rounds. We employ two filters: one to rectify noise in the dataset and the other to rectify noise in the communication channel. We also consider two types of participants: one applies a filter only to its dataset, while the other applies filters to both its dataset and its communication channel. We first derive an expression to estimate the number of communication rounds needed for convergence. Next, we determine the number of participants of each type. Finally, we perform experiments on existing datasets to verify the effectiveness of the proposed work and compare the results with state-of-the-art techniques. © 2024 IEEE. | |
| dc.identifier.doi | https://doi.org/10.1109/WCNC57260.2024.10570893 | |
| dc.identifier.uri | http://172.23.0.11:4000/handle/123456789/5294 | |
| dc.relation.ispartofseries | IEEE Wireless Communications and Networking Conference, WCNC | |
| dc.title | A Federated Learning Approach to Minimize Communication Rounds Using Noise Rectification |