Attentive neural processes

H Kim, A Mnih, J Schwarz, M Garnelo, A Eslami… - arXiv preprint arXiv …, 2019 - arxiv.org
Neural Processes (NPs) (Garnelo et al., 2018a;b) approach regression by learning to map a
context set of observed input-output pairs to a distribution over regression functions. Each …

Meta reinforcement learning with latent variable Gaussian processes

S Sæmundsson, K Hofmann, MP Deisenroth - arXiv preprint arXiv …, 2018 - arxiv.org
Learning from small data sets is critical in many practical applications where data collection
is time consuming or expensive, e.g., robotics, animal experiments or drug design. Meta …

Bayesian optimization with high-dimensional outputs

WJ Maddox, M Balandat, AG Wilson… - Advances in neural …, 2021 - proceedings.neurips.cc
Bayesian optimization is a sample-efficient black-box optimization procedure that is typically
applied to a small number of independent objectives. However, in practice we often wish to …

Gaussian process conditional density estimation

V Dutordoir, H Salimbeni, J Hensman… - Advances in neural …, 2018 - proceedings.neurips.cc
Conditional Density Estimation (CDE) models deal with estimating conditional
distributions. The conditions imposed on the distribution are the inputs of the model. CDE is …

Deep Gaussian processes with importance-weighted variational inference

H Salimbeni, V Dutordoir, J Hensman… - International …, 2019 - proceedings.mlr.press
Deep Gaussian processes (DGPs) can model complex marginal densities as well
as complex mappings. Non-Gaussian marginals are essential for modelling real-world data …

Meta-surrogate benchmarking for hyperparameter optimization

A Klein, Z Dai, F Hutter, N Lawrence… - Advances in Neural …, 2019 - proceedings.neurips.cc
Despite the recent progress in hyperparameter optimization (HPO), available benchmarks
that resemble real-world scenarios consist of only a few, very large problem instances that …

The Gaussian process autoregressive regression model (GPAR)

J Requeima, W Tebbutt, W Bruinsma… - The 22nd …, 2019 - proceedings.mlr.press
Multi-output regression models must exploit dependencies between outputs to maximise
predictive performance. The application of Gaussian processes (GPs) to this setting typically …

Volatility based kernels and moving average means for accurate forecasting with Gaussian processes

G Benton, W Maddox… - … Conference on Machine …, 2022 - proceedings.mlr.press
A broad class of stochastic volatility models are defined by systems of stochastic differential
equations, and while these models have seen widespread success in domains such as …

Fast transfer Gaussian process regression with large-scale sources

B Da, YS Ong, A Gupta, L Feng, H Liu - Knowledge-Based Systems, 2019 - Elsevier
In transfer learning, we aim to improve the predictive modeling of a target output by using the
knowledge from some related source outputs. In real-world applications, the data from the …

Large scale multi-output multi-class classification using Gaussian processes

C Ma, MA Álvarez - Machine Learning, 2023 - Springer
Multi-output Gaussian processes (MOGPs) can help to improve predictive
performance for some output variables, by leveraging the correlation with other output …