O Shamir - Conference on Learning Theory, 2022 - proceedings.mlr.press
The phenomenon of benign overfitting, where a predictor perfectly fits noisy training data while attaining low expected loss, has received much attention in recent years, but still …
F Suya, X Zhang, Y Tian… - Advances in neural …, 2024 - proceedings.neurips.cc
We study indiscriminate poisoning for linear learners where an adversary injects a few crafted examples into the training data with the goal of forcing the induced model to incur …
S Frei, Q Gu - Advances in Neural Information Processing …, 2021 - proceedings.neurips.cc
Although the optimization objectives for learning neural networks are highly non-convex, gradient-based methods have been wildly successful at learning neural networks in …
We study the fundamental problem of learning a single neuron, i.e., a function of the form $\mathbf{x}\mapsto\sigma(\mathbf{w}\cdot\mathbf{x})$ for monotone activations $\sigma:\mathbb{R}\to\mathbb{R}$, with …
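As a concrete illustration of the single-neuron setting above, here is a minimal sketch (not taken from the cited paper; all names and hyperparameters are illustrative assumptions) of fitting $\sigma(\mathbf{w}\cdot\mathbf{x})$ with plain gradient descent on the squared loss, using the ReLU as the monotone activation:

```python
import numpy as np

# Sketch: learn a single neuron x -> sigma(w . x) with sigma = ReLU,
# by gradient descent on the empirical squared loss.
# Setup, learning rate, and step count are illustrative assumptions.
rng = np.random.default_rng(0)
d, n = 5, 200
w_star = rng.normal(size=d)            # ground-truth weights
X = rng.normal(size=(n, d))            # Gaussian inputs
y = np.maximum(X @ w_star, 0.0)        # noiseless targets sigma(w* . x)

w = 0.1 * rng.normal(size=d)           # small random init (zero init stalls: ReLU grad is 0)
lr = 0.1
for _ in range(500):
    pred = np.maximum(X @ w, 0.0)
    # Gradient of 1/2 * mean (sigma(x.w) - y)^2; sigma'(z) = 1[z > 0]
    grad = (X * ((pred - y) * (X @ w > 0.0))[:, None]).mean(axis=0)
    w -= lr * grad

loss = np.mean((np.maximum(X @ w, 0.0) - y) ** 2)
```

In this noiseless, well-specified Gaussian setup gradient descent typically drives the loss near zero; the papers listed here study when such guarantees hold for general monotone activations and under weaker distributional assumptions.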
S Frei, D Zou, Z Chen, Q Gu - International Conference on …, 2022 - proceedings.mlr.press
We consider a binary classification problem when the data comes from a mixture of two rotationally symmetric distributions satisfying concentration and anti-concentration …
S Frei, Y Cao, Q Gu - International Conference on Machine …, 2021 - proceedings.mlr.press
We consider a one-hidden-layer leaky ReLU network of arbitrary width trained by stochastic gradient descent (SGD) following an arbitrary initialization. We prove that SGD produces …
We investigate approximation guarantees provided by logistic regression for the fundamental problem of agnostic learning of homogeneous halfspaces. Previously, for a …
O Shamir - Journal of Machine Learning Research, 2023 - jmlr.org
The phenomenon of benign overfitting, where a predictor perfectly fits noisy training data while attaining near-optimal expected loss, has received much attention in recent years, but …