Y Xing, Q Song, G Cheng - Advances in Neural Information …, 2022 - proceedings.neurips.cc
In the adversarial training framework of [Carmon et al., 2019; Gowal et al., 2021], people use generated/real unlabeled data with pseudolabels to improve adversarial …
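The framework this snippet refers to (pseudo-label unlabeled or generated inputs with a standard classifier, then adversarially train on the union) can be summarized in a short sketch. The PyTorch code below is an illustrative reconstruction under assumed choices (a PGD attack, hard pseudolabels from a clean-trained `teacher`), not the paper's exact algorithm; all names are hypothetical.

```python
# Sketch of a Carmon et al. (2019)-style pipeline: pseudo-label
# unlabeled/generated data, then adversarially train on everything.
# Assumptions (not from the source): PGD attack, hard pseudolabels.
import torch
import torch.nn.functional as F

def pgd_attack(model, x, y, eps=8/255, alpha=2/255, steps=10):
    """Standard L-inf PGD within radius eps (assumed attack model)."""
    delta = torch.empty_like(x).uniform_(-eps, eps).requires_grad_(True)
    for _ in range(steps):
        loss = F.cross_entropy(model(x + delta), y)
        grad, = torch.autograd.grad(loss, delta)
        delta = (delta + alpha * grad.sign()).clamp(-eps, eps) \
                    .detach().requires_grad_(True)
    return (x + delta).detach()

def train_step(model, opt, x_lab, y_lab, x_unlab, teacher):
    # 1) pseudo-label unlabeled (or generated) inputs with a
    #    clean-trained teacher network
    with torch.no_grad():
        y_pseudo = teacher(x_unlab).argmax(dim=1)
    # 2) adversarial training on labeled + pseudo-labeled data together
    x = torch.cat([x_lab, x_unlab])
    y = torch.cat([y_lab, y_pseudo])
    x_adv = pgd_attack(model, x, y)
    opt.zero_grad()
    loss = F.cross_entropy(model(x_adv), y)
    loss.backward()
    opt.step()
    return loss.item()
```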
Y Xing, Q Song, G Cheng - Advances in Neural Information …, 2022 - proceedings.neurips.cc
Adversarial training is an important algorithm for achieving robust machine learning models. However, numerous empirical results show a substantial performance degradation from clean …
Y Yang, T Wang, JP Woolard, W Xiang - Neural Networks, 2022 - Elsevier
Approximation error is a key measure in the process of model validation and verification for neural networks. In this paper, the problems of guaranteed error estimation of neural …
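The snippet does not show this paper's method. As one standard route to guaranteed output-error bounds for neural networks, the sketch below propagates an input box through a ReLU network with interval arithmetic (interval bound propagation); the function names and network format are assumptions for illustration only.

```python
# Illustrative sketch of interval bound propagation (IBP), which
# soundly encloses a ReLU network's outputs over an input box
# [lo, hi]; the paper's own estimation method may differ.
import numpy as np

def affine_bounds(W, b, lo, hi):
    """Propagate the box [lo, hi] through x -> W x + b exactly."""
    mid, rad = (lo + hi) / 2, (hi - lo) / 2
    c = W @ mid + b          # center of the output box
    r = np.abs(W) @ rad      # radius of the output box
    return c - r, c + r

def relu_network_bounds(layers, lo, hi):
    """layers: list of (W, b) pairs; returns guaranteed output bounds."""
    for i, (W, b) in enumerate(layers):
        lo, hi = affine_bounds(W, b, lo, hi)
        if i < len(layers) - 1:              # ReLU on hidden layers only
            lo, hi = np.maximum(lo, 0), np.maximum(hi, 0)
    return lo, hi
```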
Neural networks are becoming increasingly popular in applications, but our mathematical understanding of their potential and limitations is still limited. In this paper, we further this …
It has been observed that certain loss functions can render deep-learning pipelines robust against flaws in the data. In this paper, we support these empirical findings with statistical …
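The snippet does not say which losses this paper analyzes; one classical example of a loss that is robust to corrupted data is the Huber loss, sketched below, which grows linearly rather than quadratically for large residuals and so damps the influence of outliers.

```python
# Huber loss: quadratic near zero, linear in the tails. Shown purely
# as a classical example of a robust loss; not taken from the paper.
import numpy as np

def huber_loss(residual, delta=1.0):
    r = np.abs(residual)
    quad = 0.5 * r**2                 # small residuals: squared error
    lin = delta * (r - 0.5 * delta)   # large residuals: linear growth
    return np.where(r <= delta, quad, lin)
```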
L Xu, F Yao, Q Yao, H Zhang - Journal of Machine Learning Research, 2023 - jmlr.org
There has been a surge of interest in developing robust estimators for models with heavy-tailed and bounded variance data in statistics and machine learning, while few works …
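A classical robust estimator for heavy-tailed data with finite variance is median-of-means, sketched below as general background; it is not necessarily the estimator studied in this paper, and the block count `k` is an arbitrary illustrative choice.

```python
# Median-of-means: split the sample into k blocks, average each
# block, and take the median of the block means. Robust to heavy
# tails under only a finite-variance assumption.
import numpy as np

def median_of_means(x, k=10, seed=0):
    x = np.random.default_rng(seed).permutation(np.asarray(x))
    blocks = np.array_split(x, k)
    return float(np.median([b.mean() for b in blocks]))
```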
Sparsity has become popular in machine learning because it can save computational resources, facilitate interpretations, and prevent overfitting. This paper discusses sparsity in …
Neural networks have become standard tools in many areas, yet many important statistical questions remain open. This paper studies the question of how much data are needed to …
We study the Gibbs posterior distribution from PAC-Bayes theory for sparse deep neural nets in a nonparametric regression setting. To access the posterior distribution, an efficient …
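The Gibbs posterior referenced here has a standard form in nonparametric regression; the display below uses generic notation ($\lambda$, $f_\theta$, prior $\pi$) that is assumed for illustration rather than taken from the paper.

```latex
% Standard form of the Gibbs posterior in nonparametric regression;
% notation is illustrative, not copied from the source.
\[
  \pi_\lambda(\theta \mid \mathcal{D}_n)
  \;\propto\;
  \exp\Bigl\{ -\lambda \sum_{i=1}^{n} \bigl( y_i - f_\theta(x_i) \bigr)^2 \Bigr\}
  \, \pi(\theta),
\]
% where $f_\theta$ is a sparse deep network, $\pi$ a sparsity-inducing
% prior, and $\lambda > 0$ the inverse temperature.
```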