Batch active learning at scale

G Citovsky, G DeSalvo, C Gentile… - Advances in …, 2021 - proceedings.neurips.cc
The ability to train complex and highly effective models often requires an abundance of
training data, which can easily become a bottleneck in cost, time, and computational …

Tester-learners for halfspaces: Universal algorithms

A Gollakota, A Klivans… - Advances in Neural …, 2024 - proceedings.neurips.cc
We give the first tester-learner for halfspaces that succeeds universally over a wide class of
structured distributions. Our universal tester-learner runs in fully polynomial time and has the …

Learning a single neuron with adversarial label noise via gradient descent

I Diakonikolas, V Kontonis… - … on Learning Theory, 2022 - proceedings.mlr.press
We study the fundamental problem of learning a single neuron, i.e., a function of the form
$\mathbf{x} \mapsto \sigma(\mathbf{w} \cdot \mathbf{x})$ for monotone activations $\sigma : \mathbb{R} \to \mathbb{R}$, with …
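
As a concrete picture of this setting, here is a minimal sketch (not the paper's algorithm): fit $y \approx \sigma(\mathbf{w} \cdot \mathbf{x})$ by gradient descent on the squared loss, with ReLU as one monotone choice of $\sigma$ and a crude stand-in for adversarial label corruption. The noise rate, step size, and iteration count are illustrative assumptions.

```python
# Minimal sketch of the single-neuron setup: fit y ~ sigma(w . x) by gradient
# descent on the squared loss, with sigma = ReLU. The 5% corruption, step size,
# and iteration count are illustrative assumptions, not the paper's algorithm.
import numpy as np

rng = np.random.default_rng(0)
d, n = 10, 5000
w_star = rng.normal(size=d)
X = rng.normal(size=(n, d))
y = np.maximum(X @ w_star, 0.0)                      # clean labels, sigma = ReLU
flip = rng.random(n) < 0.05                          # adversarial-noise stand-in:
y[flip] += rng.normal(scale=2.0, size=flip.sum())    # corrupt 5% of the labels

w = rng.normal(size=d)                               # random init (w = 0 is a dead start)
lr = 0.1
for _ in range(500):
    pred = np.maximum(X @ w, 0.0)
    # gradient of the squared loss; sigma'(z) = 1{z > 0} for ReLU
    grad = ((pred - y) * (X @ w > 0)) @ X / n
    w -= lr * grad

print("parameter error:", np.linalg.norm(w - w_star))
```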

An efficient tester-learner for halfspaces

A Gollakota, AR Klivans, K Stavropoulos… - arXiv preprint arXiv …, 2023 - arxiv.org
We give the first efficient algorithm for learning halfspaces in the testable learning model
recently defined by Rubinfeld and Vasilyan (2023). In this model, a learner certifies that the …
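
Schematically, the testable-learning contract works as follows (an illustrative sketch, not the paper's algorithm): the learner first runs a tester on the unlabeled sample and returns a hypothesis only if the tester accepts, and the tester must accept whenever the marginal truly satisfies the target assumption, e.g. standard Gaussian. The moment check, tolerance, and placeholder learner below are assumptions for illustration.

```python
# Schematic tester-learner contract: certify the distributional assumption on
# the sample before trusting the learner's output. The moment tester, tolerance,
# and least-squares placeholder learner are illustrative assumptions.
import numpy as np

def moment_tester(X, tol=0.1):
    """Accept if empirical first/second moments look standard Gaussian."""
    mean_ok = np.linalg.norm(X.mean(axis=0)) < tol
    cov_ok = np.linalg.norm(np.cov(X.T) - np.eye(X.shape[1])) < tol
    return mean_ok and cov_ok

def tester_learner(X, y):
    if not moment_tester(X):
        return None                      # refuse: assumption not certified
    # placeholder learner: least-squares halfspace fit on the accepted sample
    return np.linalg.lstsq(X, y, rcond=None)[0]

rng = np.random.default_rng(1)
X = rng.normal(size=(20000, 5))
y = np.sign(X @ np.ones(5))
print(tester_learner(X, y) is not None)  # Gaussian marginal: tester accepts
```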

Learning general halfspaces with general massart noise under the gaussian distribution

I Diakonikolas, DM Kane, V Kontonis… - Proceedings of the 54th …, 2022 - dl.acm.org
We study the problem of PAC learning halfspaces on $\mathbb{R}^d$ with Massart noise under the
Gaussian distribution. In the Massart model, an adversary is allowed to flip the label of each …
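
The Massart model itself is easy to simulate. In the sketch below, each label of a Gaussian-marginal halfspace is flipped independently with an instance-dependent probability $\eta(\mathbf{x}) \le \eta < 1/2$ chosen by the adversary; the particular $\eta(\mathbf{x})$ used here is an illustrative adversary, not one from the paper.

```python
# Minimal sketch of the Massart (bounded) noise model: labels of the halfspace
# sign(w* . x) are flipped with adversary-chosen probability eta(x) <= eta < 1/2.
# The boundary-concentrated eta(x) below is an illustrative choice.
import numpy as np

rng = np.random.default_rng(2)
d, n, eta = 5, 10000, 0.2
w_star = rng.normal(size=d)
X = rng.normal(size=(n, d))                 # Gaussian marginal, as in the paper
y_clean = np.sign(X @ w_star)

# adversary: flip most aggressively near the decision boundary, capped at eta
margin = np.abs(X @ w_star)
flip_prob = eta * np.exp(-margin)           # eta(x) in [0, eta]
flips = rng.random(n) < flip_prob
y = np.where(flips, -y_clean, y_clean)

print("observed noise rate:", flips.mean())
```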

Forster decomposition and learning halfspaces with noise

I Diakonikolas, D Kane… - Advances in Neural …, 2021 - proceedings.neurips.cc
A Forster transform is an operation that turns a multivariate distribution into one with good
anti-concentration properties. While a Forster transform does not always exist, we show that …
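
For intuition, a Forster transform seeks an invertible $A$ putting the normalized points $A\mathbf{x}_i / \|A\mathbf{x}_i\|$ into radial isotropic position, i.e. with second-moment matrix $\frac{1}{d} I$. The naive alternating iteration below is an assumption for illustration; the paper's contribution is a decomposition that applies even when no single transform exists.

```python
# Illustrative fixed-point sketch of a Forster transform: alternate normalizing
# the points to the sphere and whitening their second-moment matrix. This naive
# iteration is an assumption for illustration, not the paper's algorithm.
import numpy as np

def forster_transform(X, iters=200):
    n, d = X.shape
    A = np.eye(d)
    for _ in range(iters):
        U = X @ A.T
        U /= np.linalg.norm(U, axis=1, keepdims=True)
        M = (d / n) * U.T @ U                  # current second-moment matrix
        evals, evecs = np.linalg.eigh(M)
        W = evecs @ np.diag(evals ** -0.5) @ evecs.T   # symmetric M^{-1/2}
        A = W @ A                              # move M toward the identity
    return A

rng = np.random.default_rng(3)
X = rng.normal(size=(1000, 4)) * np.array([5.0, 1.0, 1.0, 0.2])  # skewed cloud
A = forster_transform(X)
U = X @ A.T
U /= np.linalg.norm(U, axis=1, keepdims=True)
print(np.round((4 / 1000) * U.T @ U, 2))       # ~ identity after the transform
```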

Metric-fair active learning

J Shen, N Cui, J Wang - International conference on …, 2022 - proceedings.mlr.press
Active learning has become a prevalent technique for designing label-efficient algorithms,
where the central principle is to only query and fit “informative” labeled instances. It is …
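
The "informative instances" principle in its simplest form is uncertainty sampling: query the pool point closest to the current decision boundary. The sketch below is a generic illustration of that principle, not the paper's metric-fair algorithm; the data, budget, and logistic model are assumptions.

```python
# Generic uncertainty-sampling loop: repeatedly query the unlabeled pool point
# the current model is least sure about. Illustrative data and query budget.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
X_pool = rng.normal(size=(2000, 2))
y_pool = (X_pool @ np.array([1.5, -1.0]) > 0).astype(int)   # hidden labels

# seed with a few labels from each class so the first fit is well-posed
labeled = list(np.where(y_pool == 0)[0][:5]) + list(np.where(y_pool == 1)[0][:5])
for _ in range(40):
    clf = LogisticRegression().fit(X_pool[labeled], y_pool[labeled])
    probs = clf.predict_proba(X_pool)[:, 1]
    uncertainty = -np.abs(probs - 0.5)          # highest near the boundary
    uncertainty[labeled] = -np.inf              # never re-query a label
    labeled.append(int(np.argmax(uncertainty)))

clf = LogisticRegression().fit(X_pool[labeled], y_pool[labeled])
print("accuracy with", len(labeled), "labels:", clf.score(X_pool, y_pool))
```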

Semi-random sparse recovery in nearly-linear time

J Kelner, J Li, AX Liu, A Sidford… - The Thirty Sixth Annual …, 2023 - proceedings.mlr.press
Sparse recovery is one of the most fundamental and well-studied inverse problems.
Standard statistical formulations of the problem are provably solved by general convex …
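
The standard convex-programming route referenced here is the Lasso, solvable by proximal gradient descent (ISTA). The sketch below recovers a sparse vector from random linear measurements; the dimensions and regularization weight are illustrative, and this plain solver makes no attempt at the paper's nearly-linear runtime or robustness to semi-random measurement matrices.

```python
# Sparse recovery via the Lasso, min_x 0.5*||Ax - b||^2 + lam*||x||_1,
# solved with ISTA (proximal gradient). Dimensions and lam are illustrative.
import numpy as np

rng = np.random.default_rng(5)
n, d, k = 100, 400, 5                      # n measurements, d dims, k-sparse
A = rng.normal(size=(n, d)) / np.sqrt(n)
x_star = np.zeros(d)
x_star[rng.choice(d, k, replace=False)] = rng.normal(size=k)
b = A @ x_star

lam = 0.01
step = 1.0 / np.linalg.norm(A, 2) ** 2     # 1/L, L = Lipschitz constant of the gradient
x = np.zeros(d)
for _ in range(2000):
    grad = A.T @ (A @ x - b)
    z = x - step * grad
    x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft-threshold

print("recovery error:", np.linalg.norm(x - x_star))
```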

Fast rates in pool-based batch active learning

C Gentile, Z Wang, T Zhang - Journal of Machine Learning Research, 2024 - jmlr.org
We consider a batch active learning scenario where the learner adaptively issues batches of
points to a labeling oracle. Sampling labels in batches is highly desirable in practice due to …
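
A schematic version of this loop: each round, the learner fits a model on the labels collected so far, then sends a whole batch of pool points to the oracle at once. The top-$B$ uncertainty rule below is a simple stand-in, not the paper's batch-selection scheme, whose point is choosing batches that achieve fast rates.

```python
# Schematic pool-based *batch* active learning loop: one batch of B queries per
# round rather than one label at a time. Batch rule and sizes are illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(6)
X_pool = rng.normal(size=(5000, 3))
oracle = lambda idx: (X_pool[idx] @ np.array([1.0, -2.0, 0.5]) > 0).astype(int)

B = 25                                       # batch size per round
labeled = list(rng.choice(len(X_pool), size=B, replace=False))
y = dict(zip(labeled, oracle(np.array(labeled))))

for _ in range(8):                           # 8 rounds of batch queries
    clf = LogisticRegression().fit(X_pool[labeled], [y[i] for i in labeled])
    margin = np.abs(clf.decision_function(X_pool))
    margin[labeled] = np.inf                 # exclude already-labeled points
    batch = np.argsort(margin)[:B]           # the B most uncertain points
    for i, l in zip(batch, oracle(batch)):
        y[int(i)] = int(l)
        labeled.append(int(i))

print("labels used:", len(labeled))
```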

Achieving minimax rates in pool-based batch active learning

C Gentile, Z Wang, T Zhang - International Conference on …, 2022 - proceedings.mlr.press
We consider a batch active learning scenario where the learner adaptively issues batches of
points to a labeling oracle. Sampling labels in batches is highly desirable in practice due to …