Many problems in machine learning involve bilevel optimization (BLO), including hyperparameter optimization, meta-learning, and dataset distillation. Bilevel problems …
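The bilevel structure this snippet mentions can be sketched with a toy quadratic problem: an outer variable (e.g. a hyperparameter) is optimized through the solution of an unrolled inner optimization. The functions, step sizes, and the finite-difference hypergradient below are illustrative choices, not taken from any of the papers listed here.

```python
def inner_solve(lam, w0=0.0, lr=0.4, steps=20):
    """Unrolled inner loop: gradient descent on g(w, lam) = (w - lam)^2."""
    w = w0
    for _ in range(steps):
        w -= lr * 2.0 * (w - lam)
    return w

def outer_loss(lam):
    """Outer objective f(w*(lam)) = (w*(lam) - 3)^2, evaluated at the inner solution."""
    return (inner_solve(lam) - 3.0) ** 2

def hypergradient(lam, eps=1e-4):
    """Estimate df/dlam through the unrolled inner loop via central differences."""
    return (outer_loss(lam + eps) - outer_loss(lam - eps)) / (2 * eps)

# Outer loop: gradient descent on the hyperparameter lam.
lam = 0.0
for _ in range(100):
    lam -= 0.1 * hypergradient(lam)
# Here w*(lam) = lam, so the outer optimum is lam = 3.
```

In this toy case the inner problem has the closed-form solution w*(lam) = lam, which makes it easy to check that the outer loop recovers the correct hyperparameter; real bilevel problems replace the finite-difference estimate with backpropagation or implicit differentiation.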
Unrolled computation graphs arise in many scenarios, including training RNNs, tuning hyperparameters through unrolled optimization, and training learned optimizers. Current …
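As a minimal illustration of differentiating through an unrolled computation graph (the mechanism shared by BPTT for RNNs and unrolled optimization), here is reverse-mode backprop through a scalar linear recurrence; the recurrence and loss are invented for the example.

```python
def unroll(a, xs, h0=0.0):
    """Forward pass of the linear recurrence h_t = a * h_{t-1} + x_t."""
    hs = [h0]
    for x in xs:
        hs.append(a * hs[-1] + x)
    return hs

def bptt_grad(a, xs, y, h0=0.0):
    """dL/da for L = (h_T - y)^2, by backprop through the unrolled graph."""
    hs = unroll(a, xs, h0)
    g_h = 2.0 * (hs[-1] - y)    # dL/dh_T
    g_a = 0.0
    for t in range(len(xs), 0, -1):
        g_a += g_h * hs[t - 1]  # local contribution: dh_t/da = h_{t-1}
        g_h *= a                # push the gradient back to h_{t-1}
    return g_a
```

The backward loop visits every time step, which is exactly the memory/compute cost that truncation and the evolution-strategies estimators discussed in these results try to avoid.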
C Chen, Y Zhang, X Liu… - … Conference on Machine …, 2023 - proceedings.mlr.press
Offline model-based optimization aims to maximize a black-box objective function with a static dataset of designs and their scores. In this paper, we focus on biological sequence …
O Bohdal, Y Yang… - Advances in neural …, 2021 - proceedings.neurips.cc
Gradient-based meta-learning and hyperparameter optimization have seen significant progress recently, enabling practical end-to-end training of neural networks together with …
W Zhou, Y Li, Y Yang, H Wang… - Advances in neural …, 2020 - proceedings.neurips.cc
Off-Policy Actor-Critic (OffP-AC) methods have proven successful in a variety of continuous control tasks. Normally, the critic's action-value function is updated using …
Drawing inspiration from gradient-based meta-learning methods with infinitely small gradient steps, we introduce Continuous-Time Meta-Learning (COMLN), a meta-learning …
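The continuous-time view referenced here treats inner-loop adaptation as a gradient flow rather than a sequence of discrete gradient steps; schematically (symbols chosen for illustration, not quoted from the paper):

```latex
\frac{d\phi(t)}{dt} = -\nabla_\phi \, \mathcal{L}_{\mathrm{train}}\big(\phi(t)\big),
\qquad \phi(0) = \theta,
```

so that a discrete inner gradient step $\phi_{k+1} = \phi_k - \alpha \nabla_\phi \mathcal{L}_{\mathrm{train}}(\phi_k)$ is recovered as the Euler discretization of this flow with step size $\alpha$.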
P Vicol - International Conference on Machine Learning, 2023 - proceedings.mlr.press
We propose an evolution strategies-based algorithm for estimating gradients in unrolled computation graphs, called ES-Single. Similarly to the recently-proposed Persistent …
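Evolution-strategies gradient estimation of the kind this snippet builds on can be sketched with a generic antithetic estimator. This is not the ES-Single algorithm itself, only the standard ES baseline it extends; the test function and all constants are invented for the example.

```python
import numpy as np

def es_grad(f, theta, sigma=0.1, n_pairs=2000, seed=0):
    """Antithetic evolution-strategies estimate of the gradient of a black-box f.

    Each pair evaluates f at theta + sigma*eps and theta - sigma*eps, so the
    estimator only needs function values, never backpropagation.
    """
    rng = np.random.default_rng(seed)
    grad = np.zeros_like(theta)
    for _ in range(n_pairs):
        eps = rng.standard_normal(theta.shape)
        grad += (f(theta + sigma * eps) - f(theta - sigma * eps)) / (2 * sigma) * eps
    return grad / n_pairs

# Example: f(theta) = sum(theta**2) has true gradient 2 * theta.
theta = np.array([1.0, -2.0])
g = es_grad(lambda t: float(np.sum(t ** 2)), theta)
```

Because the estimator touches `f` only through forward evaluations, it can be applied to a full unrolled computation graph without storing intermediate states, which is the property the ES-based methods in these results exploit.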
We propose a framework for online meta-optimization of parameters that govern optimization, called Amortized Proximal Optimization (APO). We first interpret various …
B Zhao, J Wu, Y Ma, C Yang - IEEE Open Journal of the …, 2024 - ieeexplore.ieee.org
Deep learning has been used for optimizing a multitude of wireless problems. Yet most existing works assume that training and test samples are drawn from the same distribution …