N Carlini, S Chien, M Nasr, S Song… - … IEEE Symposium on …, 2022 - ieeexplore.ieee.org
A membership inference attack allows an adversary to query a trained machine learning model to predict whether or not a particular example was contained in the model's training …
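Since the snippet cuts off before describing any attack mechanics, here is a minimal sketch of the simplest membership-inference baseline: threshold the model's loss on a candidate example, since training-set members tend to have lower loss. This is only an illustrative baseline, not the calibrated attack of this particular paper; `model`, `x`, `y`, and `threshold` are placeholder names.

    import torch
    import torch.nn.functional as F

    def loss_threshold_mia(model, x, y, threshold):
        """Baseline membership inference: members of the training set
        tend to have lower loss, so flag low-loss examples as members.
        `threshold` is a placeholder the attacker must calibrate, e.g.
        using shadow models trained on similar data."""
        model.eval()
        with torch.no_grad():
            loss = F.cross_entropy(model(x.unsqueeze(0)), y.unsqueeze(0))
        return loss.item() < threshold  # True -> predicted "member"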
M Paul, S Ganguli… - Advances in neural …, 2021 - proceedings.neurips.cc
Recent success in deep learning has partially been driven by training increasingly overparametrized networks on ever larger datasets. It is therefore natural to ask: how much …
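One concrete answer from this line of work is to score each training example and prune away the easy ones. A hedged sketch of such a score, the L2 norm of the error vector (softmax output minus one-hot label) computed with a partially trained model, follows; the `loader` and `device` arguments are illustrative.

    import torch
    import torch.nn.functional as F

    def error_norm_scores(model, loader, device="cpu"):
        """Score each example by ||softmax(logits) - onehot(label)||_2
        using a partially trained model. High-scoring (hard) examples
        are kept; low-scoring ones are candidates for pruning."""
        model.eval()
        scores = []
        with torch.no_grad():
            for x, y in loader:
                p = F.softmax(model(x.to(device)), dim=1)
                onehot = F.one_hot(y.to(device), num_classes=p.size(1)).float()
                scores.append((p - onehot).norm(dim=1).cpu())
        return torch.cat(scores)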
Personalized federated learning (PFL) is an exciting approach that allows machine learning (ML) models to be trained on diverse and decentralized sources of data, while maintaining …
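To make the idea of personalization concrete, here is a minimal sketch of one common PFL baseline: keep the shared global feature extractor frozen and fine-tune only the classifier head on each client's private data. This is not necessarily this paper's method, and the attribute name `model.fc` is hypothetical, depending on the architecture.

    import copy
    import torch

    def personalize_head(global_model, client_loader, loss_fn, lr=0.01, epochs=1):
        """Common PFL baseline: freeze the shared backbone and fine-tune
        only the final layer on a client's private data. Assumes the
        classifier head is exposed as `model.fc` (hypothetical)."""
        model = copy.deepcopy(global_model)
        for p in model.parameters():
            p.requires_grad_(False)
        for p in model.fc.parameters():  # unfreeze the head only
            p.requires_grad_(True)
        opt = torch.optim.SGD(model.fc.parameters(), lr=lr)
        for _ in range(epochs):
            for x, y in client_loader:
                opt.zero_grad()
                loss_fn(model(x), y).backward()
                opt.step()
        return model  # each client keeps its own personalized copy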
As a research community, we still lack a systematic understanding of progress on adversarial robustness, which often makes it hard to identify the most promising ideas in …
A central challenge in training classification models in real-world federated systems is learning with non-IID data. To cope with this, most existing works involve enforcing …
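The snippet truncates before naming the mechanism, but one widely used family of fixes (FedProx-style) enforces a proximal term that pulls local weights toward the global model during client updates. A minimal sketch with illustrative hyperparameters, not necessarily this paper's method:

    import copy
    import torch

    def local_update_prox(global_model, client_loader, loss_fn,
                          mu=0.01, lr=0.1, epochs=1):
        """One client's local training with a FedProx-style penalty
        (mu/2)*||w - w_global||^2 that limits client drift on non-IID
        data. mu, lr, and epochs are illustrative hyperparameters."""
        model = copy.deepcopy(global_model)
        anchor = [p.detach().clone() for p in global_model.parameters()]
        opt = torch.optim.SGD(model.parameters(), lr=lr)
        for _ in range(epochs):
            for x, y in client_loader:
                opt.zero_grad()
                loss = loss_fn(model(x), y)
                prox = sum(((p - g) ** 2).sum()
                           for p, g in zip(model.parameters(), anchor))
                (loss + 0.5 * mu * prox).backward()
                opt.step()
        return model.state_dict()  # the server averages these updates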
Deep learning's recent history has been one of achievement: from triumphing over humans in the game of Go to world-leading performance in image classification, voice recognition …
C He, M Annavaram… - Advances in Neural …, 2020 - proceedings.neurips.cc
Scaling up the convolutional neural network (CNN) size (e.g., width, depth) is known to effectively improve model accuracy. However, the large model size impedes training on …
Federated learning (FL) is a rapidly growing research field in machine learning. However, existing FL libraries cannot adequately support diverse algorithmic development; …
F Tung, G Mori - Proceedings of the IEEE/CVF international …, 2019 - openaccess.thecvf.com
Knowledge distillation is a widely applicable technique for training a student neural network under the guidance of a trained teacher network. For example, in neural network …
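For reference, the generic setup this abstract describes is the classic temperature-scaled distillation objective (Hinton et al.): a KL term between softened teacher and student distributions blended with the usual hard-label cross-entropy. This entry appears to propose a similarity-preserving variant rather than this baseline, so the sketch below is only the standard formulation; T and alpha are typical but illustrative values.

    import torch.nn.functional as F

    def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
        """Classic distillation loss: KL between temperature-softened
        teacher and student distributions (scaled by T^2), blended
        with hard-label cross-entropy on the true labels."""
        soft = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                        F.softmax(teacher_logits / T, dim=1),
                        reduction="batchmean") * (T * T)
        hard = F.cross_entropy(student_logits, labels)
        return alpha * soft + (1 - alpha) * hard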