We present Theseus, an efficient, application-agnostic open-source library for differentiable nonlinear least squares (DNLS) optimization built on PyTorch, providing a common …
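The core abstraction is a differentiable optimization layer: an inner nonlinear least-squares solve whose solution can be backpropagated through. A minimal sketch, following the usage pattern in the Theseus README (th.Objective, th.AutoDiffCostFunction, th.GaussNewton, th.TheseusLayer); the toy line-fitting problem, variable names, and dimensions here are our own:

```python
import torch
import theseus as th

# Synthetic data for a toy fit y = a*x + b (batch size 1, N points).
N = 100
x_data = torch.linspace(0, 1, N).unsqueeze(0)
y_data = 2.0 * x_data + 0.5 + 0.01 * torch.randn(1, N)

x = th.Variable(x_data, name="x")   # fixed auxiliary data
y = th.Variable(y_data, name="y")
ab = th.Vector(2, name="ab")        # optimization variable: (slope, intercept)

def residual_fn(optim_vars, aux_vars):
    ab = optim_vars[0].tensor                     # shape (1, 2)
    x, y = (v.tensor for v in aux_vars)
    return y - (ab[:, 0:1] * x + ab[:, 1:2])      # one residual per data point

objective = th.Objective()
objective.add(th.AutoDiffCostFunction([ab], residual_fn, N, aux_vars=[x, y]))
layer = th.TheseusLayer(th.GaussNewton(objective, max_iterations=20))

# The layer acts like any torch module: its output is differentiable with
# respect to the input tensors, so it can sit inside a larger model.
solution, info = layer.forward({"x": x_data, "y": y_data, "ab": torch.zeros(1, 2)})
print(solution["ab"])  # approximately [[2.0, 0.5]]
```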
The popularity of deep learning has led to the curation of a vast number of massive and multifarious datasets. Despite having close-to-human performance on individual tasks …
We propose a new dataset distillation algorithm using reparameterization and convexification of implicit gradients (RCIG) that substantially improves the state-of-the-art …
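The "implicit gradients" in the name refer to differentiating the distilled data through the inner training problem via the implicit function theorem rather than by unrolling it; schematically (our notation, not the paper's):

$$\theta^{*}(\phi) = \arg\min_{\theta} \mathcal{L}_{\mathrm{in}}(\theta, \phi), \qquad \frac{d\theta^{*}}{d\phi} = -\left[\nabla^{2}_{\theta\theta} \mathcal{L}_{\mathrm{in}}\right]^{-1} \nabla^{2}_{\theta\phi} \mathcal{L}_{\mathrm{in}} \Big|_{\theta = \theta^{*}(\phi)},$$

so the gradient of an outer loss evaluated at θ*(φ) reaches the synthetic data φ without storing the inner optimization trajectory.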
M Blondel, V Roulet - arXiv preprint arXiv:2403.14606, 2024 - arxiv.org
Artificial intelligence has recently experienced remarkable advances, fueled by large models, vast datasets, accelerated hardware, and, last but not least, the transformative …
While deep learning models have replaced hand-designed features across many domains, these models are still trained with hand-designed optimizers. In this work, we leverage the …
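As a concrete, deliberately tiny illustration of the learned-optimizer idea described here (not the architecture or scale of the paper itself): a per-parameter MLP maps gradient features to parameter updates, and the MLP's own weights are what gets meta-trained.

```python
import torch
import torch.nn as nn

class LearnedOptimizer(nn.Module):
    """Toy learned optimizer: an MLP maps per-parameter features (gradient
    and a momentum trace) to an update. Illustrative only."""
    def __init__(self, hidden=32, beta=0.9, step_scale=0.01):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(2, hidden), nn.ReLU(), nn.Linear(hidden, 1))
        self.beta, self.step_scale = beta, step_scale

    def step(self, params, grads, moms):
        new_params, new_moms = [], []
        for p, g, m in zip(params, grads, moms):
            m = self.beta * m + (1 - self.beta) * g
            feats = torch.stack([g.flatten(), m.flatten()], dim=-1)
            update = self.net(feats).view_as(p)   # one update per scalar parameter
            new_params.append(p - self.step_scale * update)
            new_moms.append(m)
        return new_params, new_moms

# Meta-training unrolls `step` over an inner training run and backpropagates
# the final loss into self.net (the usual learning-to-learn setup), omitted here.
```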
RT Lange - Proceedings of the Companion Conference on Genetic …, 2023 - dl.acm.org
The deep learning revolution has been greatly accelerated by the 'hardware lottery': recent advances in modern hardware accelerators, compilers, and the availability of open-source …
The well-designed structures in neural networks reflect the prior knowledge incorporated into the models. However, although different models embody different priors, we are used to …
B Amos - Foundations and Trends® in Machine Learning, 2023 - nowpublishers.com
Optimization is a ubiquitous modeling tool and is often deployed in settings which repeatedly solve similar instances of the same problem. Amortized optimization methods …
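A toy sketch of the amortized setting the survey formalizes, under our own assumptions (a fixed linear least-squares family; all names illustrative): instead of running a solver per instance, a network is trained to map each problem instance directly to an approximate minimizer, with the objective value itself as the training loss.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
d = 8
A = torch.randn(d, d) + 3 * torch.eye(d)   # fixed, well-conditioned problem family

model = nn.Sequential(nn.Linear(d, 64), nn.ReLU(), nn.Linear(64, d))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for _ in range(2000):
    b = torch.randn(256, d)    # a batch of instances of min_x ||Ax - b||^2
    x_hat = model(b)           # amortized "solver": a single forward pass
    loss = ((x_hat @ A.T - b) ** 2).sum(-1).mean()  # objective value; no x* labels needed
    opt.zero_grad()
    loss.backward()
    opt.step()

# After training, model(b) approximates argmin_x ||Ax - b||^2 = A^{-1} b at a
# fraction of the per-instance cost of an iterative solver.
```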
Optimizing functions without access to gradients is the remit of black-box methods such as evolution strategies. While highly general, their learning dynamics are often heuristic …
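For reference, the classic isotropic-Gaussian evolution strategy that such meta-learned variants build on fits in a few lines; the rank shaping and hyperparameters below are generic textbook choices, not those of the paper.

```python
import numpy as np

def es_minimize(f, x0, sigma=0.1, lr=0.02, popsize=64, iters=200, seed=0):
    """Minimal isotropic-Gaussian evolution strategy (score-function search
    gradient with rank-based fitness shaping)."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    for _ in range(iters):
        eps = rng.standard_normal((popsize, x.size))        # candidate perturbations
        fitness = np.array([f(x + sigma * e) for e in eps])
        # Rank-normalize fitness to [-0.5, 0.5] for scale invariance, then
        # estimate the search gradient and step against it (minimization).
        ranks = fitness.argsort().argsort() / (popsize - 1) - 0.5
        grad = (ranks[:, None] * eps).sum(0) / (popsize * sigma)
        x -= lr * grad
    return x

# Example: minimize a shifted sphere function without using its gradient.
x_star = es_minimize(lambda z: ((z - 3.0) ** 2).sum(), x0=np.zeros(5))
```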