Institutional Digital Repository
Shreenivas Deshpande Library, IIT (BHU), Varanasi

Poster Abstract: Efficient Knowledge Distillation to Train Lightweight Neural Network for Heterogeneous Edge Devices


Abstract

This poster presents a novel approach that leverages large deep neural networks to derive lightweight variants, addressing constraints on storage, processing speed, and task execution time across heterogeneous edge devices. Knowledge distillation is employed to train the lightweight networks, and a novel early-termination technique is introduced to reduce resource consumption and speed up training. The approach achieves satisfactory accuracy while accommodating the diverse constraints of heterogeneous edge devices. © 2023 Copyright is held by the owner/author(s). Publication rights licensed to ACM.
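The abstract names two ingredients, knowledge distillation and early termination, without giving formulas. A minimal sketch of both is below, using the standard temperature-softened softmax and a KL-divergence distillation loss; the `patience`/`min_delta` stopping rule is an assumption for illustration, not the paper's actual early-termination criterion.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; higher T softens the distribution,
    # exposing the teacher's "dark knowledge" about non-target classes.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=4.0):
    # KL divergence between the softened teacher and student
    # distributions: the student is trained to match the teacher.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

def should_stop(loss_history, patience=3, min_delta=1e-3):
    # Hypothetical early-termination rule (stand-in for the paper's
    # technique): stop when the loss has not improved by at least
    # min_delta over the last `patience` epochs.
    if len(loss_history) <= patience:
        return False
    best_before = min(loss_history[:-patience])
    recent_best = min(loss_history[-patience:])
    return best_before - recent_best < min_delta
```

In a training loop, `distillation_loss` would be evaluated per batch (typically combined with a cross-entropy term against the hard labels), and `should_stop` checked once per epoch to cut training short on resource-constrained devices.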
