A network resource aware federated learning approach using knowledge distillation
| dc.contributor.author | Mishra R.; Gupta H.P.; Dutta T. |
| dc.date.accessioned | 2025-05-23T11:26:26Z |
| dc.description.abstract | Federated Learning (FL) has seen unprecedented growth in the past few years by facilitating data privacy. This poster proposes a network-resource-aware federated learning approach that uses knowledge distillation to train a machine learning model on local data samples. The approach partitions clients into groups based on the bandwidth between each client and the server, then iteratively applies FL within each group, compressing the model via knowledge distillation. This reduces the bandwidth requirement and yields a more robust model trained on the data of all clients without compromising privacy. © 2021 IEEE. |
| dc.identifier.doi | https://doi.org/10.1109/INFOCOMWKSHPS51825.2021.9484597 |
| dc.identifier.uri | http://172.23.0.11:4000/handle/123456789/10297 |
| dc.relation.ispartofseries | IEEE INFOCOM 2021 - IEEE Conference on Computer Communications Workshops, INFOCOM WKSHPS 2021 |
| dc.title | A network resource aware federated learning approach using knowledge distillation |
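The abstract outlines two mechanisms: grouping clients by their bandwidth to the server, and compressing the per-group model with knowledge distillation. A minimal sketch of both ideas is below; it is an illustration under stated assumptions, not the paper's implementation. The names `group_by_bandwidth`, `distillation_loss`, the bandwidth tier edges, and the temperature value are all hypothetical; the distillation objective shown is the standard temperature-softened KL divergence between teacher and student outputs.

```python
# Hypothetical sketch of the two steps the abstract describes.
# All names, tier edges, and the temperature are illustrative assumptions.
import math

def group_by_bandwidth(clients, tier_edges=(1.0, 10.0)):
    """Partition (name, Mbps) client pairs into low/mid/high bandwidth groups.
    The two tier edges (in Mbps) are arbitrary example thresholds."""
    groups = {"low": [], "mid": [], "high": []}
    for name, mbps in clients:
        if mbps < tier_edges[0]:
            groups["low"].append(name)
        elif mbps < tier_edges[1]:
            groups["mid"].append(name)
        else:
            groups["high"].append(name)
    return groups

def softmax(logits, temperature=1.0):
    """Temperature-softened softmax over a list of logits."""
    exps = [math.exp(z / temperature) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL(teacher || student) on temperature-softened distributions,
    the usual knowledge-distillation objective for model compression."""
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# Example: three clients with different measured bandwidths to the server.
clients = [("c1", 0.5), ("c2", 4.0), ("c3", 25.0)]
print(group_by_bandwidth(clients))
print(distillation_loss([2.0, 0.5, 0.1], [1.8, 0.6, 0.2]))
```

In a full pipeline, each group would run FL rounds on a student model sized to its tier's link budget, distilled from the larger server-side model; the sketch only shows the grouping rule and the distillation objective.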