Principles and practice of explainable machine learning

V Belle, I Papantonis - Frontiers in Big Data, 2021 - frontiersin.org
Artificial intelligence (AI) provides many opportunities to improve private and public life.
Discovering patterns and structures in large troves of data in an automated manner is a core …

A comprehensive survey on test-time adaptation under distribution shifts

J Liang, R He, T Tan - arXiv preprint arXiv:2303.15361, 2023 - arxiv.org
Machine learning methods strive to acquire a robust model during training that can
generalize well to test samples, even under distribution shifts. However, these methods often …
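The adaptation this survey covers is easiest to see in its entropy-minimization family of methods (e.g., Tent): update the model on each unlabeled test batch by minimizing the entropy of its own predictions. A minimal PyTorch sketch, assuming a classifier `model` and an optimizer over whichever parameters one chooses to adapt (commonly only the normalization layers); the helper name is illustrative:

```python
import torch

def adapt_on_batch(model, x, optimizer):
    # One test-time adaptation step on an unlabeled batch x:
    # minimize the entropy of the model's predictive distribution.
    logits = model(x)
    probs = logits.softmax(dim=1)
    entropy = -(probs * torch.log(probs + 1e-8)).sum(dim=1).mean()
    optimizer.zero_grad()
    entropy.backward()
    optimizer.step()
    return logits.detach()
```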

Data-free knowledge distillation for heterogeneous federated learning

Z Zhu, J Hong, J Zhou - International Conference on Machine …, 2021 - proceedings.mlr.press
Federated Learning (FL) is a decentralized machine-learning paradigm, in which a global
server iteratively averages the model parameters of local users without accessing their data …
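A minimal sketch of the parameter-averaging step described here, assuming PyTorch models and FedAvg-style weighting by client dataset size (the helper name and weighting scheme are illustrative):

```python
import copy
import torch

def federated_average(global_model, client_models, client_sizes):
    # Weighted average of client parameters, without touching client data.
    total = sum(client_sizes)
    avg_state = copy.deepcopy(global_model.state_dict())
    for key in avg_state:
        avg_state[key] = sum(
            (n / total) * cm.state_dict()[key].float()
            for cm, n in zip(client_models, client_sizes)
        )
    global_model.load_state_dict(avg_state)
    return global_model
```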

A survey of machine unlearning

TT Nguyen, TT Huynh, PL Nguyen, AWC Liew… - arXiv preprint arXiv …, 2022 - arxiv.org
Today, computer systems hold large amounts of personal data. Yet while such an
abundance of data allows breakthroughs in artificial intelligence, and especially machine …

Knowledge distillation with the reused teacher classifier

D Chen, JP Mei, H Zhang, C Wang… - Proceedings of the …, 2022 - openaccess.thecvf.com
Knowledge distillation aims to compress a powerful yet cumbersome teacher model
into a lightweight student model without much sacrifice of performance. For this purpose …
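For reference, the standard distillation objective this line of work builds on is Hinton et al.'s softened-logit matching: a cross-entropy term on the labels plus a KL term to the teacher's temperature-softened outputs. A PyTorch sketch, with the temperature and mixing weight as assumed defaults:

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    # KL divergence between softened student and teacher distributions,
    # scaled by T^2 to keep gradient magnitudes comparable.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard
```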

See through gradients: Image batch recovery via GradInversion

H Yin, A Mallya, A Vahdat, JM Alvarez… - Proceedings of the …, 2021 - openaccess.thecvf.com
Training deep neural networks requires gradient estimation from data batches to update
parameters. Gradients per parameter are averaged over a set of data and this has been …
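The attack surface described here is that an observer of the averaged gradient can optimize a dummy batch until its own gradient matches the observed one. A hedged PyTorch sketch of that gradient-matching loop, assuming the labels have already been recovered (GradInversion derives them from the final-layer gradients) and with illustrative hyperparameters:

```python
import torch
import torch.nn.functional as F

def invert_gradients(model, target_grads, labels, batch_shape, steps=1000, lr=0.1):
    # Optimize a random dummy batch so that the gradient it induces
    # matches the observed (averaged) gradient, element-wise.
    dummy = torch.randn(batch_shape, requires_grad=True)
    opt = torch.optim.Adam([dummy], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = F.cross_entropy(model(dummy), labels)
        grads = torch.autograd.grad(loss, model.parameters(), create_graph=True)
        mismatch = sum(((g - t) ** 2).sum() for g, t in zip(grads, target_grads))
        mismatch.backward()
        opt.step()
    return dummy.detach()
```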

Knowledge distillation: A survey

J Gou, B Yu, SJ Maybank, D Tao - International Journal of Computer Vision, 2021 - Springer
In recent years, deep neural networks have been successful in both industry and academia,
especially for computer vision tasks. The great success of deep learning is mainly due to its …

Ensemble distillation for robust model fusion in federated learning

T Lin, L Kong, SU Stich, M Jaggi - Advances in Neural …, 2020 - proceedings.neurips.cc
Federated Learning (FL) is a machine learning setting where many devices collaboratively
train a machine learning model while keeping the training data decentralized. In most of the …
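A minimal sketch of the ensemble-distillation fusion step this paper proposes (FedDF): average the client models' predictions on unlabeled data and distill them into the server model. The function name, temperature, and source of unlabeled data are assumptions:

```python
import torch
import torch.nn.functional as F

def ensemble_distill_step(server_model, client_models, x_unlabeled, optimizer, T=1.0):
    # Teacher signal: the mean of the clients' softened predictions.
    with torch.no_grad():
        teacher_probs = torch.stack(
            [F.softmax(m(x_unlabeled) / T, dim=1) for m in client_models]
        ).mean(dim=0)
    # Student update: pull the server model toward the ensemble.
    student_log_probs = F.log_softmax(server_model(x_unlabeled) / T, dim=1)
    loss = F.kl_div(student_log_probs, teacher_probs, reduction="batchmean")
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```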

Source-free domain adaptation for semantic segmentation

Y Liu, W Zhang, J Wang - … of the IEEE/CVF Conference on …, 2021 - openaccess.thecvf.com
Unsupervised Domain Adaptation (UDA) can tackle the challenge that
convolutional neural network (CNN)-based approaches for semantic segmentation heavily …

Knowledge distillation and student-teacher learning for visual intelligence: A review and new outlooks

L Wang, KJ Yoon - IEEE Transactions on Pattern Analysis and …, 2021 - ieeexplore.ieee.org
Deep neural models have, in recent years, been successful in almost every field, even
solving the most complex problems. However, these models are huge in size, with …