Taking the human out of the loop: A review of Bayesian optimization

B Shahriari, K Swersky, Z Wang… - Proceedings of the …, 2015 - ieeexplore.ieee.org
Big Data applications are typically associated with systems involving large numbers of
users, massive complex software systems, and large-scale heterogeneous computing and …

Expected improvement for expensive optimization: a review

D Zhan, H Xing - Journal of Global Optimization, 2020 - Springer
The expected improvement (EI) algorithm is a very popular method for expensive
optimization problems. In the past twenty years, the EI criterion has been extended to deal …

A survey of automatic parameter tuning methods for metaheuristics

C Huang, Y Li, X Yao - IEEE Transactions on Evolutionary …, 2019 - ieeexplore.ieee.org
Parameter tuning, that is, finding appropriate parameter settings (or configurations) of
algorithms so that their performance is optimized, is an important task in the development …

A prescription of methodological guidelines for comparing bio-inspired optimization algorithms

A LaTorre, D Molina, E Osaba, J Poyatos… - Swarm and Evolutionary …, 2021 - Elsevier
Bio-inspired optimization (including Evolutionary Computation and Swarm Intelligence) is a
growing research topic with many competitive bio-inspired algorithms being proposed every …

Exploration and exploitation in evolutionary algorithms: A survey

M Črepinšek, SH Liu, M Mernik - ACM Computing Surveys (CSUR), 2013 - dl.acm.org
“Exploration and exploitation are the two cornerstones of problem solving by search.” For
more than a decade, Eiben and Schippers' advocacy for balancing between these two …
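As a concrete, minimal illustration of the trade-off (not an example taken from the survey), the classic (1+1)-ES with the 1/5-success rule steers between exploration and exploitation with a single step-size parameter. The function name and constants below are illustrative choices:

```python
import numpy as np

def one_plus_one_es(f, x0, sigma=1.0, iters=500, seed=0):
    """(1+1)-ES with the 1/5-success rule: the mutation step size sigma is the
    exploration knob, widened after successful mutations and narrowed after
    failures so the search gradually shifts from exploration to exploitation."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    for _ in range(iters):
        y = x + sigma * rng.standard_normal(x.shape)  # explorative Gaussian mutation
        fy = f(y)
        if fy < fx:            # success: accept the offspring and widen the search
            x, fx = y, fy
            sigma *= 1.5
        else:                  # failure: narrow the search around the incumbent
            sigma *= 1.5 ** -0.25
    return x, fx

# Example: minimize the 5-dimensional sphere function.
x_best, f_best = one_plus_one_es(lambda v: float(np.sum(v ** 2)), x0=3.0 * np.ones(5))
print(x_best, f_best)
```

Large steps after successes widen the search; repeated failures shrink the step size and focus it around the incumbent.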

The irace package: Iterated racing for automatic algorithm configuration

M López-Ibáñez, J Dubois-Lacoste, LP Cáceres… - Operations Research …, 2016 - Elsevier
Modern optimization algorithms typically require the setting of a large number of parameters
to optimize their performance. The immediate goal of automatic algorithm configuration is to …
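The core idea of racing is easy to sketch. The toy loop below is a conceptual Python sketch, not the irace R API; the function name, defaults, and statistical test choice are illustrative. Candidate configurations are evaluated instance by instance, and a candidate is dropped once a one-sided paired Wilcoxon test judges it significantly worse than the current incumbent:

```python
import numpy as np
from scipy.stats import wilcoxon

def race(configs, evaluate, instances, alpha=0.05, min_instances=5):
    """Single race over candidate configurations: evaluate instance by instance
    and eliminate a candidate once it is significantly worse (one-sided paired
    Wilcoxon test) than the incumbent on the instances seen so far."""
    alive = list(range(len(configs)))
    costs = {i: [] for i in alive}                  # one cost per seen instance
    for k, inst in enumerate(instances, start=1):
        for i in alive:
            costs[i].append(evaluate(configs[i], inst))
        if k < min_instances or len(alive) < 2:
            continue
        best = min(alive, key=lambda i: np.mean(costs[i]))
        survivors = [best]
        for i in alive:
            if i == best:
                continue
            # H1: candidate i has higher cost than the incumbent
            _, p = wilcoxon(costs[i], costs[best], alternative="greater")
            if p >= alpha:
                survivors.append(i)                 # not significantly worse: keep racing
        alive = survivors
    return min(alive, key=lambda i: np.mean(costs[i]))

# Toy usage: candidate "configurations" are mutation rates of a fictitious solver,
# instances are seeds, and the cost function is simulated noise around an optimum.
rng = np.random.default_rng(1)
best = race(configs=[0.01, 0.05, 0.2, 0.5],
            evaluate=lambda c, inst: abs(c - 0.05) + 0.1 * abs(rng.standard_normal()),
            instances=range(25))
print("selected configuration index:", best)
```

Iterated racing additionally samples new candidates around the survivors and repeats the race; the sketch above covers only a single race.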

Sequential model-based optimization for general algorithm configuration

F Hutter, HH Hoos, K Leyton-Brown - … , LION 5, Rome, Italy, January 17-21 …, 2011 - Springer
State-of-the-art algorithms for hard computational problems often expose many parameters
that can be modified to improve empirical performance. However, manually exploring the …
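The sequential model-based optimization loop itself is compact. The sketch below is a minimal Python illustration of the idea, not the SMAC implementation: it handles only continuous parameters and omits instance features, intensification, and run capping. It fits a random-forest surrogate to the configurations evaluated so far and picks the next one by expected improvement over a random candidate pool; names and defaults are illustrative:

```python
import numpy as np
from scipy.stats import norm
from sklearn.ensemble import RandomForestRegressor

def smbo(objective, bounds, n_init=10, n_iter=40, seed=0):
    """Minimal sequential model-based optimization loop (a sketch of the idea,
    not SMAC itself): a random-forest surrogate models cost, and expected
    improvement over a random candidate pool picks the next configuration."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    X = rng.uniform(lo, hi, size=(n_init, len(lo)))
    y = np.array([objective(x) for x in X])
    for _ in range(n_iter):
        forest = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
        cand = rng.uniform(lo, hi, size=(1000, len(lo)))
        # empirical mean/spread of the per-tree predictions as surrogate uncertainty
        per_tree = np.stack([t.predict(cand) for t in forest.estimators_])
        mu, sigma = per_tree.mean(axis=0), per_tree.std(axis=0) + 1e-9
        z = (y.min() - mu) / sigma
        ei = (y.min() - mu) * norm.cdf(z) + sigma * norm.pdf(z)   # expected improvement
        x_next = cand[np.argmax(ei)]
        X = np.vstack([X, x_next])
        y = np.append(y, objective(x_next))
    return X[np.argmin(y)], y.min()

# Example: minimize a 2-D quadratic over [-5, 5]^2.
best_x, best_y = smbo(lambda x: float((x[0] - 1) ** 2 + (x[1] + 2) ** 2),
                      bounds=[(-5, 5), (-5, 5)])
print(best_x, best_y)
```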

A tutorial on Bayesian optimization of expensive cost functions, with application to active user modeling and hierarchical reinforcement learning

E Brochu, VM Cora, N De Freitas - arXiv preprint arXiv:1012.2599, 2010 - arxiv.org
We present a tutorial on Bayesian optimization, a method of finding the maximum of
expensive cost functions. Bayesian optimization employs the Bayesian technique of setting …
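The loop the tutorial describes can be sketched in a few lines: place a Gaussian-process prior on the objective, update it after every evaluation, and maximize an acquisition function (expected improvement, here written for maximization) to choose the next query point. This is an illustrative scikit-learn sketch, not code from the tutorial, and the function name and settings are invented for the example:

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def bayes_opt(f, bounds, n_init=5, n_iter=25, seed=0):
    """Sketch of a Bayesian optimization loop: a Gaussian-process surrogate is
    refit after each evaluation, and expected improvement over a random
    candidate pool selects the next point to evaluate."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    X = rng.uniform(lo, hi, size=(n_init, len(lo)))
    y = np.array([f(x) for x in X])
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    for _ in range(n_iter):
        gp.fit(X, y)
        cand = rng.uniform(lo, hi, size=(2000, len(lo)))
        mu, sigma = gp.predict(cand, return_std=True)
        sigma = np.maximum(sigma, 1e-9)
        z = (mu - y.max()) / sigma
        ei = (mu - y.max()) * norm.cdf(z) + sigma * norm.pdf(z)  # EI for maximization
        x_next = cand[np.argmax(ei)]
        X, y = np.vstack([X, x_next]), np.append(y, f(x_next))
    return X[np.argmax(y)], y.max()

# Example: maximize a 1-D test function on [0, 10].
x_star, y_star = bayes_opt(lambda x: float(np.sin(x[0]) - 0.1 * (x[0] - 5) ** 2),
                           bounds=[(0, 10)])
print(x_star, y_star)
```

This complements the random-forest variant sketched above: the surrogate changes, but the propose-evaluate-update loop is the same.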

Online batch selection for faster training of neural networks

I Loshchilov, F Hutter - arXiv preprint arXiv:1511.06343, 2015 - arxiv.org
Deep neural networks are commonly trained using stochastic non-convex optimization
procedures, which are driven by gradient information estimated on fractions (batches) of the …
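The selection scheme studied in this line of work can be illustrated with a short rank-based sampler. The decay schedule below is a simplified stand-in in the spirit of the paper, not its exact formulation, and the function and parameter names are illustrative: examples are ranked by their most recently observed loss and sampled with probability that decays exponentially with rank, so batches concentrate on currently hard examples.

```python
import numpy as np

def select_batch(latest_losses, batch_size, s=100.0, rng=None):
    """Rank-based online batch selection (simplified schedule): rank 0 is the
    example with the largest recent loss, and selection probability decays
    exponentially with rank so the hardest example is roughly s times more
    likely to be drawn than the easiest one."""
    if rng is None:
        rng = np.random.default_rng()
    n = len(latest_losses)
    order = np.argsort(latest_losses)[::-1]   # indices sorted hardest-first
    ranks = np.empty(n, dtype=int)
    ranks[order] = np.arange(n)               # rank 0 = largest recent loss
    p = np.exp(-ranks * np.log(s) / n)        # ~s-fold probability ratio across ranks
    p /= p.sum()
    return rng.choice(n, size=batch_size, replace=False, p=p)

# Example: 10,000 training examples, draw a batch of 64 biased toward high loss.
losses = np.random.default_rng(1).exponential(size=10_000)
batch = select_batch(losses, batch_size=64, rng=np.random.default_rng(2))
```

In training, the stored losses of the selected examples would be refreshed after each forward pass, so the ranking keeps tracking the model as it learns.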

Benchmarking in optimization: Best practice and open issues

T Bartz-Beielstein, C Doerr, D Berg, J Bossek… - arXiv preprint arXiv …, 2020 - arxiv.org
This survey compiles ideas and recommendations from more than a dozen researchers with
different backgrounds and from different institutes around the world. Promoting best practice …