Knowledge Distillation
AI Model Compression: Running Large Models on Small Devices
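Knowledge distillation compresses a large "teacher" model into a smaller "student" by training the student to match the teacher's temperature-softened output distribution alongside the true labels. As a minimal illustrative sketch only (the function names and the α/T blend below are assumptions, not taken from this page), the standard distillation loss can be written as:

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: higher T yields a softer distribution.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, true_label,
                      temperature=2.0, alpha=0.5):
    """Blend of a soft-target term (teacher) and hard-label cross-entropy.

    alpha weights the distillation term; temperature softens both
    distributions so the student can learn the teacher's relative
    class similarities. Hyperparameter values here are illustrative.
    """
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    # KL(teacher || student) on softened distributions, scaled by T^2
    # to keep gradient magnitudes comparable across temperatures.
    soft_loss = sum(
        pt * math.log(pt / ps)
        for pt, ps in zip(p_teacher, p_student) if pt > 0
    ) * temperature ** 2
    # Standard cross-entropy against the one-hot ground-truth label.
    hard_loss = -math.log(softmax(student_logits)[true_label])
    return alpha * soft_loss + (1 - alpha) * hard_loss
```

When the student's logits exactly match the teacher's, the soft-target term vanishes and only the hard-label cross-entropy remains, which is a useful sanity check when wiring this into a training loop.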