Frugal optimization for cost-related hyperparameters

Q Wu, C Wang, S Huang - Proceedings of the AAAI Conference on …, 2021 - ojs.aaai.org
Abstract
The increasing demand for democratizing machine learning algorithms calls for hyperparameter optimization (HPO) solutions at low cost. Many machine learning algorithms have hyperparameters that can cause large variation in training cost, but this effect is largely ignored by existing HPO methods, which are incapable of properly controlling cost during the optimization process. To address this problem, we develop a new cost-frugal HPO solution. The core of our solution is a simple yet novel randomized direct-search method, for which we provide theoretical guarantees on the convergence rate and the total cost incurred to achieve convergence. We provide strong empirical results in comparison with state-of-the-art HPO methods on large AutoML benchmarks.
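The abstract's key ingredient is a randomized direct-search method. As a hedged illustration of that general family of techniques (this is a generic sketch, not the paper's specific algorithm or its cost-control mechanism; the function and parameter names are hypothetical), a minimal randomized direct search proposes a random direction, accepts the move if it improves the objective, tries the opposite direction otherwise, and shrinks the step size when neither helps:

```python
import random

def randomized_direct_search(f, x0, step=0.1, min_step=1e-3, max_evals=100):
    """Generic randomized direct-search sketch (hypothetical names).

    At each iteration, sample a random direction d; if f improves at
    x + step*d, accept the move; otherwise try x - step*d; if neither
    improves, halve the step size. Stops when the evaluation budget is
    spent or the step size falls below min_step.
    """
    x, fx = list(x0), f(x0)
    evals = 1
    while evals < max_evals and step > min_step:
        d = [random.gauss(0.0, 1.0) for _ in x]            # random direction
        cand = [xi + step * di for xi, di in zip(x, d)]
        fc = f(cand)
        evals += 1
        if fc < fx:
            x, fx = cand, fc                               # accept forward move
            continue
        cand = [xi - step * di for xi, di in zip(x, d)]    # try opposite direction
        fc = f(cand)
        evals += 1
        if fc < fx:
            x, fx = cand, fc
        else:
            step *= 0.5                                    # no improvement: shrink step
    return x, fx
```

For example, minimizing a simple quadratic `f(v) = sum(t*t for t in v)` from a starting point like `[1.0, 1.0]` steadily decreases the objective as the step size adapts. The paper's contribution lies in pairing such a search with provable bounds on convergence rate and total incurred cost, which this sketch does not capture.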