Tester-learners for halfspaces: Universal algorithms

A Gollakota, A Klivans… - Advances in Neural …, 2024 - proceedings.neurips.cc
We give the first tester-learner for halfspaces that succeeds universally over a wide class of
structured distributions. Our universal tester-learner runs in fully polynomial time and has the …

Time/accuracy tradeoffs for learning a ReLU with respect to Gaussian marginals

S Goel, S Karmalkar, A Klivans - Advances in neural …, 2019 - proceedings.neurips.cc
We consider the problem of computing the best-fitting ReLU with respect to square-loss on a
training set when the examples have been drawn according to a spherical Gaussian …
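To make the setup concrete, here is a minimal, illustrative sketch (not the paper's algorithm): fit a ReLU under square loss on examples drawn from a spherical Gaussian by plain gradient descent. Dimension, sample size, step size, and iteration count are arbitrary choices for the sketch.

```python
# Illustrative sketch only (not the paper's algorithm): fit a ReLU under square
# loss on spherical-Gaussian examples via plain gradient descent.
# Dimension, sample size, step size, and iteration count are arbitrary choices.
import numpy as np

rng = np.random.default_rng(0)
d, n = 10, 5000
w_star = rng.normal(size=d)
w_star /= np.linalg.norm(w_star)

X = rng.normal(size=(n, d))        # examples from a spherical Gaussian
y = np.maximum(X @ w_star, 0.0)    # labels from a ground-truth ReLU (noiseless here)

w = 0.1 * rng.normal(size=d)       # small random init (all-zeros has zero gradient)
lr = 0.1
for _ in range(500):
    z = X @ w
    pred = np.maximum(z, 0.0)
    grad = (2.0 / n) * (X.T @ ((pred - y) * (z > 0)))   # gradient of empirical square loss
    w -= lr * grad

print("square loss:", np.mean((np.maximum(X @ w, 0.0) - y) ** 2))
```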

Improved algorithms for neural active learning

Y Ban, Y Zhang, H Tong… - Advances in Neural …, 2022 - proceedings.neurips.cc
We improve the theoretical and empirical performance of neural-network (NN)-based active
learning algorithms for the non-parametric streaming setting. In particular, we introduce two …

An efficient tester-learner for halfspaces

A Gollakota, AR Klivans, K Stavropoulos… - arXiv preprint arXiv …, 2023 - arxiv.org
We give the first efficient algorithm for learning halfspaces in the testable learning model
recently defined by Rubinfeld and Vasilyan (2023). In this model, a learner certifies that the …
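A rough, hypothetical sketch of the tester-learner interface this model describes: run a distributional test on the sample and return a hypothesis only if the test accepts. The crude moment check and the off-the-shelf logistic-regression learner below are stand-ins, not the algorithm from the paper; thresholds are arbitrary.

```python
# Hypothetical sketch of the tester-learner interface: certify the sample with a
# distributional test, and output a hypothesis only if the test accepts. The crude
# moment check and the logistic-regression learner are stand-ins, not the paper's
# algorithm; thresholds are arbitrary.
import numpy as np
from sklearn.linear_model import LogisticRegression

def moment_tester(X, tol=0.1):
    """Crude check that low-order moments look roughly standard Gaussian."""
    mean_ok = np.linalg.norm(X.mean(axis=0)) <= tol
    cov_ok = np.linalg.norm(np.cov(X, rowvar=False) - np.eye(X.shape[1])) <= tol * X.shape[1]
    return mean_ok and cov_ok

def tester_learner(X, y):
    if not moment_tester(X):
        return None                                      # reject: no guarantee certified
    return LogisticRegression(max_iter=1000).fit(X, y)   # accept: output a halfspace-style hypothesis

rng = np.random.default_rng(1)
X = rng.normal(size=(4000, 5))
y = (X @ rng.normal(size=5) > 0).astype(int)
model = tester_learner(X, y)
print("accepted" if model is not None else "rejected")
```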

Improved algorithms for efficient active learning halfspaces with Massart and Tsybakov noise

C Zhang, Y Li - Conference on Learning Theory, 2021 - proceedings.mlr.press
We give a computationally-efficient PAC active learning algorithm for $d$-dimensional
homogeneous halfspaces that can tolerate Massart noise (Massart and Nedelec, 2006) and …

Efficient active learning of sparse halfspaces with arbitrary bounded noise

C Zhang, J Shen, P Awasthi - Advances in Neural …, 2020 - proceedings.neurips.cc
We study active learning of homogeneous $s$-sparse halfspaces in $\mathbb{R}^d$
under the setting where the unlabeled data distribution is isotropic log-concave and each …

Metric-fair active learning

J Shen, N Cui, J Wang - International conference on …, 2022 - proceedings.mlr.press
Active learning has become a prevalent technique for designing label-efficient algorithms,
where the central principle is to only query and fit “informative” labeled instances. It is …

Region-based active learning

C Cortes, G DeSalvo, C Gentile… - The 22nd …, 2019 - proceedings.mlr.press
We study a scenario of active learning where the input space is partitioned into different
regions and where a distinct hypothesis is learned for each region. We first introduce a new …
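An illustrative toy version of this scenario (not the paper's method): partition the input space into regions, keep a distinct linear hypothesis per region, and query labels within each region by simple uncertainty sampling. The synthetic data, region split, label budget, and base learner are all assumptions made for the sketch.

```python
# Toy illustration of the region-based scenario (not the paper's algorithm): the
# input space is partitioned into regions, a distinct hypothesis is learned per
# region, and labels are queried per region by uncertainty sampling. The synthetic
# data, region split, label budget, and base learner are all assumed for the sketch.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
X = rng.normal(size=(2000, 3))
y = (np.sin(X[:, 0]) + X[:, 1] > 0).astype(int)    # synthetic labels

region = (X[:, 0] > 0).astype(int)                 # two regions, split on the first coordinate
models, budget_per_region = {}, 50

for r in (0, 1):
    idx = np.flatnonzero(region == r)
    seed = np.concatenate([idx[y[idx] == 0][:5], idx[y[idx] == 1][:5]])  # seed set with both classes
    labeled = list(seed)
    for _ in range(budget_per_region):
        clf = LogisticRegression(max_iter=1000).fit(X[labeled], y[labeled])
        pool = np.setdiff1d(idx, labeled)
        margins = np.abs(clf.decision_function(X[pool]))
        labeled.append(pool[np.argmin(margins)])   # query the least-confident point in this region
    models[r] = LogisticRegression(max_iter=1000).fit(X[labeled], y[labeled])

preds = np.array([models[int(r)].predict(x[None, :])[0] for r, x in zip(region, X)])
print("train accuracy:", (preds == y).mean())
```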

DivTheft: An ensemble model stealing attack by divide-and-conquer

Z Ma, X Liu, Y Liu, X Liu, Z Qin… - IEEE transactions on …, 2023 - ieeexplore.ieee.org
Recently, model stealing attacks have been widely studied, but most of them focus on stealing
a single non-discrete model, e.g., neural networks. For ensemble models, these attacks are …

Online active learning with surrogate loss functions

G DeSalvo, C Gentile, TS Thune - Advances in neural …, 2021 - proceedings.neurips.cc
We derive a novel active learning algorithm in the streaming setting for binary classification
tasks. The algorithm leverages weak labels to minimize the number of label requests, and …
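A minimal sketch of that idea under assumed specifics (not the paper's algorithm): in a stream, trust a cheap weak labeler when the current halfspace is confident, and spend a label request only when the margin is small, with a perceptron-style update. The weak labeler, margin threshold, and update rule are illustrative assumptions.

```python
# Minimal sketch of the streaming idea (not the paper's algorithm): trust a cheap
# weak labeler when the current halfspace is confident, and spend a label request
# only when the margin is small. The weak labeler, threshold, and perceptron-style
# update are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)
d, T, tau, lr = 5, 3000, 0.2, 0.05
w_true = rng.normal(size=d)
w_true /= np.linalg.norm(w_true)
w = np.zeros(d)
queries = 0

def weak_label(x):
    """Noisy, cheap surrogate for the true label (10% flip rate)."""
    y = np.sign(x @ w_true)
    return -y if rng.random() < 0.1 else y

for _ in range(T):
    x = rng.normal(size=d)
    margin = abs(x @ w) / (np.linalg.norm(w) + 1e-12)
    if margin < tau:                    # uncertain: request the true label
        y = np.sign(x @ w_true)
        queries += 1
    else:                               # confident: fall back on the weak label
        y = weak_label(x)
    if y * (x @ w) <= 0:                # perceptron-style update on mistakes
        w += lr * y * x

test = rng.normal(size=(1000, d))
agree = np.mean(np.sign(test @ w) == np.sign(test @ w_true))
print(f"label requests: {queries}/{T}, test agreement: {agree:.2f}")
```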