1. K. Borup, L. N. Andersen. Even your Teacher Needs Guidance: Ground-Truth Targets Dampen Regularization Imposed by Self-Distillation. Advances in Neural Information Processing Systems 34, 5316-5327, 2021. Cited by 8.
2. K. Borup, P. Kidmose, H. Phan, K. Mikkelsen. Automatic sleep scoring using patient-specific ensemble models and knowledge distillation for ear-EEG data. Biomedical Signal Processing and Control 81, 104496, 2023. Cited by 7.
3. K. Borup, L. N. Andersen. Self-distillation for Gaussian process regression and classification. arXiv preprint arXiv:2304.02641, 2023. Cited by 2.
4. K. Borup, C. P. Phoo, B. Hariharan. Distilling from Similar Tasks for Transfer Learning on a Budget. arXiv preprint arXiv:2304.12314, 2023. Cited by 1.
5. K. Borup, C. P. Phoo, B. Hariharan. Distilling from Similar Tasks for Transfer Learning on a Budget. Proceedings of the IEEE/CVF International Conference on Computer Vision …, 2023. Cited by 1.
6. K. E. Borup. Knowledge Distillation Techniques for Machine Learning. Matematisk Institut, Aarhus Universitet, 2023.
7. K. Borup, L. N. Andersen. Self-Distillation for Gaussian Process Models. 2023.
8. K. E. Borup. A Quest for Perfect Teacher-Student Agreement in Knowledge Distillation. 2023.
9. K. Borup, C. P. Phoo, B. Hariharan. Learning Efficient Models From Few Labels By Distillation From Multiple Tasks. 2022.
10. K. Borup, C. P. Phoo, B. Hariharan. Supplementary Materials for Distilling from Similar Tasks for Transfer Learning on a Budget.