Institutional Digital Repository
Shreenivas Deshpande Library, IIT (BHU), Varanasi

Poster Abstract: Efficient Knowledge Distillation to Train Lightweight Neural Network for Heterogeneous Edge Devices

dc.contributor.author: Kumari P.; Gupta H.P.; Sikdar B.
dc.date.accessioned: 2025-05-23T11:16:42Z
dc.description.abstract: This poster presents a novel approach that harnesses large-sized deep neural networks to craft lightweight variants, addressing constraints in storage, processing speed, and task execution time on heterogeneous edge devices. Knowledge distillation is employed to refine the training of lightweight deep neural networks, and a novel early termination technique is introduced to optimize resource utilization and expedite the training process. This approach yields satisfactory accuracy while accommodating diverse heterogeneous edge device constraints. © 2023 Copyright is held by the owner/author(s). Publication rights licensed to ACM.
dc.identifier.doi: https://doi.org/10.1145/3625687.3628409
dc.identifier.uri: http://172.23.0.11:4000/handle/123456789/6569
dc.relation.ispartofseries: SenSys 2023 - Proceedings of the 21st ACM Conference on Embedded Networked Sensor Systems
dc.title: Poster Abstract: Efficient Knowledge Distillation to Train Lightweight Neural Network for Heterogeneous Edge Devices
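
The abstract names two generic ingredients, knowledge distillation and an early termination criterion, that can be sketched in code. The snippet below is a minimal, hypothetical PyTorch illustration of that combination, not the authors' implementation: the temperature, loss weighting, and the accuracy-based stopping rule (target_accuracy, val_fn) are assumed placeholders rather than values from the poster.

    # Hypothetical sketch: distill a frozen teacher into a lightweight student,
    # stopping early once a validation-accuracy target is met. All names and
    # hyperparameters here are illustrative, not taken from the poster.
    import torch
    import torch.nn.functional as F

    def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
        """Standard KD loss: soft-target KL term plus hard-label cross-entropy."""
        soft = F.kl_div(
            F.log_softmax(student_logits / T, dim=1),
            F.softmax(teacher_logits / T, dim=1),
            reduction="batchmean",
        ) * (T * T)
        hard = F.cross_entropy(student_logits, labels)
        return alpha * soft + (1.0 - alpha) * hard

    def train_student(student, teacher, train_loader, val_fn,
                      epochs=50, target_accuracy=0.90, lr=1e-3):
        """Train the student against the frozen teacher; terminate early when
        val_fn reports accuracy at or above target_accuracy (a stand-in for
        the poster's termination criterion)."""
        teacher.eval()
        opt = torch.optim.Adam(student.parameters(), lr=lr)
        for epoch in range(epochs):
            student.train()
            for x, y in train_loader:
                with torch.no_grad():
                    t_logits = teacher(x)          # teacher provides soft targets
                s_logits = student(x)
                loss = distillation_loss(s_logits, t_logits, y)
                opt.zero_grad()
                loss.backward()
                opt.step()
            if val_fn(student) >= target_accuracy:  # early termination check
                break
        return student

In this sketch the early check simply trades residual accuracy for training time and resource use on the device; the poster's actual criterion may differ.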
