Teacher, trainee, and student based knowledge distillation technique for monitoring indoor activities: Poster abstract
| dc.contributor.author | Mishra R.; Gupta H.P.; Dutta T. | |
| dc.date.accessioned | 2025-05-23T11:30:05Z | |
| dc.description.abstract | Recent years have witnessed unprecedented growth in sensor-based indoor activity recognition. Moreover, a significant improvement in the recognition performance of indoor activities has been observed by incorporating Deep Neural Network (DNN) models. In this paper, we propose a knowledge distillation-based, economical, and efficient indoor activity recognition approach for low-cost, resource-constrained devices. Here, we adopt knowledge from a teacher and a trainee (cumbersome DNN models) for training a student (compressed DNN model). Initially, both the student and the trainee are beginners, and the trainee helps the student learn from the teacher. After a certain number of steps, the student is mature enough to learn directly from the teacher. We introduce an early halting mechanism that simultaneously reduces the floating-point operations and training time of the student model. © 2020 ACM. | |
| dc.identifier.doi | https://doi.org/10.1145/3384419.3430450 | |
| dc.identifier.uri | http://172.23.0.11:4000/handle/123456789/11770 | |
| dc.relation.ispartofseries | SenSys 2020 - Proceedings of the 2020 18th ACM Conference on Embedded Networked Sensor Systems | |
| dc.title | Teacher, trainee, and student based knowledge distillation technique for monitoring indoor activities: Poster abstract |
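The abstract describes a student model that first learns via a trainee intermediary and later learns directly from the teacher. A minimal sketch of that idea is a distillation loss whose soft target shifts from trainee-softened outputs to teacher-softened outputs as training progresses. This is an illustrative assumption, not the authors' implementation; the function names, the linear blending schedule, and the `switch_step` parameter are all hypothetical.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax; higher T gives softer targets."""
    z = np.asarray(logits, dtype=float) / T
    z -= z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

def kl_div(p, q, eps=1e-12):
    """KL divergence KL(p || q) between two probability vectors."""
    p, q = np.asarray(p), np.asarray(q)
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

def distillation_loss(student_logits, teacher_logits, trainee_logits,
                      step, switch_step=100, T=4.0):
    """Hypothetical blended distillation target: early in training the
    trainee's soft labels dominate; after `switch_step` steps the student
    is treated as mature and follows the teacher directly."""
    alpha = min(step / switch_step, 1.0)  # 0 -> trainee only, 1 -> teacher only
    student_soft = softmax(student_logits, T)
    teacher_soft = softmax(teacher_logits, T)
    trainee_soft = softmax(trainee_logits, T)
    target = (1.0 - alpha) * trainee_soft + alpha * teacher_soft
    return kl_div(target, student_soft)
```

For example, at `step=0` the loss measures only the gap to the trainee, and from `step=switch_step` onward only the gap to the teacher; the abstract's early halting mechanism (stopping student training once improvement stalls, to cut floating-point operations) would wrap a loop around this loss but is not sketched here.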