Layer-Level Knowledge Distillation for Deep Neural Network Learning

Motivated by recently developed distillation approaches that aim to obtain small, fast-to-execute models, this paper proposes a novel Layer Selectivity Learning (LSL) framework for learning deep models. We first use an asymmetric dual-model learning framework, called Auxiliary Structu...
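The abstract is truncated above, so the authors' LSL method cannot be reconstructed from this record. As a point of reference only, the sketch below illustrates generic layer-level knowledge distillation, the family of techniques the title names: a small student model is trained against both the task labels and the intermediate-layer features of a larger teacher. All class names, layer sizes, and the loss weighting are hypothetical, and the projection layers that align teacher and student feature widths are a common hint-based trick, not necessarily what this paper does.

```python
# Hypothetical sketch of layer-level knowledge distillation.
# NOT the paper's LSL method; it only illustrates matching a student's
# intermediate features to a teacher's, alongside the usual task loss.
import torch
import torch.nn as nn
import torch.nn.functional as F

class StudentNet(nn.Module):
    """Small, fast-to-execute model (hypothetical sizes)."""
    def __init__(self):
        super().__init__()
        self.layer1 = nn.Linear(784, 128)
        self.layer2 = nn.Linear(128, 64)
        self.head = nn.Linear(64, 10)

    def forward(self, x):
        h1 = F.relu(self.layer1(x))
        h2 = F.relu(self.layer2(h1))
        # Return logits plus intermediate features for layer-wise losses.
        return self.head(h2), [h1, h2]

class TeacherNet(nn.Module):
    """Larger model whose layer features supervise the student."""
    def __init__(self):
        super().__init__()
        self.layer1 = nn.Linear(784, 512)
        self.layer2 = nn.Linear(512, 256)
        self.head = nn.Linear(256, 10)
        # Projections mapping teacher features to student feature widths,
        # so per-layer losses are dimensionally comparable.
        self.proj = nn.ModuleList([nn.Linear(512, 128), nn.Linear(256, 64)])

    def forward(self, x):
        h1 = F.relu(self.layer1(x))
        h2 = F.relu(self.layer2(h1))
        return self.head(h2), [self.proj[0](h1), self.proj[1](h2)]

def distillation_loss(student_out, teacher_out, labels, alpha=0.5):
    s_logits, s_feats = student_out
    t_logits, t_feats = teacher_out
    task = F.cross_entropy(s_logits, labels)        # ordinary task loss
    layer = sum(F.mse_loss(s, t.detach())           # match each layer pair;
                for s, t in zip(s_feats, t_feats))  # teacher is frozen
    return task + alpha * layer

# Usage on a dummy batch:
student, teacher = StudentNet(), TeacherNet()
x, y = torch.randn(8, 784), torch.randint(0, 10, (8,))
loss = distillation_loss(student(x), teacher(x), y)
loss.backward()
```

In practice the teacher would be pre-trained and only the student (and any alignment projections) would be optimized; `alpha` trades off the task loss against the layer-matching terms.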


Bibliographic Details
Main Authors: Hao-Ting Li, Shih-Chieh Lin, Cheng-Yeh Chen, Chen-Kuo Chiang
Format: Article
Language: English
Published: MDPI AG, 2019-05-01
Series: Applied Sciences
Online Access: https://www.mdpi.com/2076-3417/9/10/1966