We give the first tester-learner for halfspaces that succeeds universally over a wide class of structured distributions. Our universal tester-learner runs in fully polynomial time and has the …
We study the fundamental problem of learning a single neuron, i.e., a function of the form $\mathbf{x}\mapsto\sigma(\mathbf{w}\cdot\mathbf{x})$ for monotone activations $\sigma:\mathbb{R}\to\mathbb{R}$, with …
We give the first efficient algorithm for learning halfspaces in the testable learning model recently defined by Rubinfeld and Vasilyan (2023). In this model, a learner certifies that the …
We study the problem of PAC learning halfspaces on $\mathbb{R}^d$ with Massart noise under the Gaussian distribution. In the Massart model, an adversary is allowed to flip the label of each …
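The Massart model described in this snippet can be illustrated with a small simulation. The sketch below is only a minimal, assumed setup (the choice of per-point flip function `eta_x` is hypothetical, not from the paper): labels of a halfspace under a Gaussian marginal are flipped independently, each with a point-dependent probability bounded by a noise parameter `eta`.

```python
import numpy as np

rng = np.random.default_rng(1)
d, n, eta = 3, 10_000, 0.2  # eta: the Massart noise upper bound (assumed value)

# Target halfspace and Gaussian marginal.
w = rng.normal(size=d)
w /= np.linalg.norm(w)
X = rng.normal(size=(n, d))
y_clean = np.sign(X @ w)

# A Massart adversary may choose any flip probability eta(x) <= eta per point.
# Hypothetical choice for illustration: flip more often near the decision boundary.
eta_x = eta * np.exp(-np.abs(X @ w))
flips = rng.random(n) < eta_x
y = np.where(flips, -y_clean, y_clean)

noise_rate = flips.mean()  # empirical fraction of flipped labels, below eta
```

Because every per-point flip probability is at most `eta`, the observed noise rate stays below the Massart bound, yet the noise is adversarially structured rather than uniform.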
A Forster transform is an operation that turns a multivariate distribution into one with good anti-concentration properties. While a Forster transform does not always exist, we show that …
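One standard way to compute a Forster transform in well-conditioned cases is a fixed-point iteration toward radial isotropic position: find a matrix $A$ so that the normalized points $y_i = A x_i / \|A x_i\|$ have second moment $(d/n)\sum_i y_i y_i^\top = I$. The sketch below is a heuristic illustration of that fixed point, not the paper's algorithm, and assumes points in general position (e.g. random Gaussian points) so that the iteration converges.

```python
import numpy as np

def forster_iterate(X, iters=500):
    """Heuristic fixed-point iteration toward radial isotropic position.

    Seeks A such that y_i = A x_i / ||A x_i|| satisfy (d/n) sum_i y_i y_i^T = I,
    i.e. the normalized points have good anti-concentration in every direction.
    Assumes the input points are in general position.
    """
    n, d = X.shape
    A = np.eye(d)
    for _ in range(iters):
        Y = X @ A.T
        Y /= np.linalg.norm(Y, axis=1, keepdims=True)   # project to unit sphere
        M = (d / n) * Y.T @ Y                           # current second moment
        # Move A toward the fixed point M = I via A <- M^{-1/2} A.
        vals, vecs = np.linalg.eigh(M)
        A = (vecs @ np.diag(vals ** -0.5) @ vecs.T) @ A
    return A

rng = np.random.default_rng(2)
n, d = 60, 3
X = rng.normal(size=(n, d))

A = forster_iterate(X)
Y = X @ A.T
Y /= np.linalg.norm(Y, axis=1, keepdims=True)
M = (d / n) * Y.T @ Y
err = np.linalg.norm(M - np.eye(d))  # small when the transform has converged
```

At the fixed point, no direction carries disproportionate mass, which is exactly the anti-concentration property the snippet refers to; when points concentrate on a lower-dimensional subspace, no such transform exists and the iteration fails to converge.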
J Shen, N Cui, J Wang - International conference on …, 2022 - proceedings.mlr.press
Active learning has become a prevalent technique for designing label-efficient algorithms, where the central principle is to only query and fit “informative” labeled instances. It is …
J Kelner, J Li, AX Liu, A Sidford… - The Thirty Sixth Annual …, 2023 - proceedings.mlr.press
Sparse recovery is one of the most fundamental and well-studied inverse problems. Standard statistical formulations of the problem are provably solved by general convex …
C Gentile, Z Wang, T Zhang - International Conference on …, 2022 - proceedings.mlr.press
We consider a batch active learning scenario where the learner adaptively issues batches of points to a labeling oracle. Sampling labels in batches is highly desirable in practice due to …
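The batch active learning scenario in this snippet can be sketched with a generic margin-based strategy (a standard baseline, not necessarily the method of Gentile, Wang, and Zhang): in each round the learner fits a model on the labels gathered so far, then issues one batch of the currently most uncertain pool points to the labeling oracle. All names and parameters below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n_pool, batch, rounds = 5, 500, 20, 5  # assumed problem sizes

# Synthetic pool: points labeled by an unknown halfspace (the oracle).
w_true = rng.normal(size=d)
w_true /= np.linalg.norm(w_true)
X = rng.normal(size=(n_pool, d))
y = np.sign(X @ w_true)

def fit_logistic(Xl, yl, steps=500, lr=0.5):
    """Plain gradient-descent logistic regression on the labeled set."""
    w = np.zeros(Xl.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-np.clip(Xl @ w, -30, 30)))
        w -= lr * Xl.T @ (p - (yl + 1) / 2) / len(yl)
    return w

# Seed batch chosen uniformly, then margin-based batch queries.
labeled = list(rng.choice(n_pool, size=batch, replace=False))
w = fit_logistic(X[labeled], y[labeled])
for _ in range(rounds):
    unlabeled = np.setdiff1d(np.arange(n_pool), labeled)
    margins = np.abs(X[unlabeled] @ w)                 # small margin = uncertain
    pick = unlabeled[np.argsort(margins)[:batch]]      # one batch per oracle call
    labeled.extend(pick)                               # oracle reveals y on pick
    w = fit_logistic(X[labeled], y[labeled])

acc = np.mean(np.sign(X @ w) == y)  # pool accuracy with few labels queried
```

Querying in batches trades some per-query informativeness for far fewer oracle round trips, which is the practical motivation the snippet alludes to.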